Dec 09 14:58:09 crc systemd[1]: Starting Kubernetes Kubelet...
Dec 09 14:58:09 crc restorecon[4554]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 
14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:09 crc 
restorecon[4554]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 
14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 
14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc 
restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 14:58:09 crc restorecon[4554]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 09 14:58:09 crc restorecon[4554]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 09 14:58:09 crc restorecon[4554]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Dec 09 14:58:10 crc kubenswrapper[4555]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 09 14:58:10 crc kubenswrapper[4555]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Dec 09 14:58:10 crc kubenswrapper[4555]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 09 14:58:10 crc kubenswrapper[4555]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Dec 09 14:58:10 crc kubenswrapper[4555]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Dec 09 14:58:10 crc kubenswrapper[4555]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.036636 4555 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.038943 4555 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.038960 4555 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.038965 4555 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.038969 4555 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.038973 4555 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.038977 4555 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.038980 4555 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.038984 4555 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.038988 4555 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.038991 4555 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.038996 4555 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
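The deprecation warnings in the preceding entries all point at the same remedy: move those command-line flags into the KubeletConfiguration file referenced by the kubelet's --config flag. As a minimal sketch only, the Go program below builds such a configuration with the kubelet.config.k8s.io/v1beta1 types and prints it as YAML; every value in it (the CRI-O socket path, the volume plugin directory, the example taint, the system reservations) is a placeholder assumption rather than a value read from this node, and the ContainerRuntimeEndpoint field assumes a kubelet recent enough (v1.27+) to carry it in the config API.

// Sketch: render a KubeletConfiguration covering the flags reported as deprecated
// in this log. Field values are illustrative placeholders, not this cluster's settings.
package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	kubeletv1beta1 "k8s.io/kubelet/config/v1beta1"
	"sigs.k8s.io/yaml"
)

func main() {
	cfg := kubeletv1beta1.KubeletConfiguration{
		TypeMeta: metav1.TypeMeta{
			APIVersion: "kubelet.config.k8s.io/v1beta1",
			Kind:       "KubeletConfiguration",
		},
		// Replaces --container-runtime-endpoint (placeholder socket path).
		ContainerRuntimeEndpoint: "unix:///var/run/crio/crio.sock",
		// Replaces --volume-plugin-dir (placeholder directory).
		VolumePluginDir: "/etc/kubernetes/kubelet-plugins/volume/exec",
		// Replaces --register-with-taints (example taint, not taken from this log).
		RegisterWithTaints: []corev1.Taint{
			{Key: "node-role.kubernetes.io/master", Effect: corev1.TaintEffectNoSchedule},
		},
		// Replaces --system-reserved (example reservations, not taken from this log).
		SystemReserved: map[string]string{"cpu": "500m", "memory": "1Gi"},
	}

	out, err := yaml.Marshal(&cfg)
	if err != nil {
		panic(err)
	}
	fmt.Print(string(out))
}

Writing the marshaled output to the file named by --config and dropping the corresponding flags from the kubelet unit would be expected to silence these particular deprecation warnings; the flags named only as "will be removed in a future release" (such as --pod-infra-container-image) have no config-file equivalent and stay as-is.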
Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.039000 4555 feature_gate.go:330] unrecognized feature gate: Example Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.039009 4555 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.039013 4555 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.039017 4555 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.039020 4555 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.039023 4555 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.039026 4555 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.039030 4555 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.039035 4555 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.039038 4555 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.039041 4555 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.039044 4555 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.039047 4555 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.039052 4555 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.039056 4555 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.039059 4555 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.039062 4555 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.039065 4555 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.039069 4555 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.039072 4555 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.039077 4555 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
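The long run of "unrecognized feature gate" warnings is noisy but harmless: feature_gate.go only warns about and skips names it has not registered, and the names here (ManagedBootImages, GatewayAPI, InsightsConfig, and so on) appear to be OpenShift cluster-level gates that are handed to every component whether or not that component implements them. To reduce the run to something readable, a sketch assuming shell access to the node (unit name assumed to be kubelet):

    journalctl -b -u kubelet --no-pager \
      | grep 'unrecognized feature gate' \
      | sed 's/.*unrecognized feature gate: //' \
      | sort | uniq -c | sort -rn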
Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.039081 4555 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.039084 4555 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.039087 4555 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.039090 4555 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.039093 4555 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.039096 4555 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.039100 4555 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.039104 4555 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.039108 4555 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.039112 4555 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.039117 4555 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.039121 4555 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.039124 4555 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.039128 4555 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.039131 4555 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.039135 4555 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.039138 4555 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.039141 4555 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.039144 4555 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.039147 4555 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.039151 4555 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.039154 4555 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.039157 4555 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.039160 4555 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.039163 4555 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 
14:58:10.039166 4555 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.039170 4555 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.039173 4555 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.039176 4555 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.039179 4555 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.039182 4555 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.039186 4555 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.039189 4555 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.039192 4555 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.039195 4555 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.039199 4555 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.039202 4555 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.039205 4555 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.039209 4555 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039279 4555 flags.go:64] FLAG: --address="0.0.0.0" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039287 4555 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039293 4555 flags.go:64] FLAG: --anonymous-auth="true" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039313 4555 flags.go:64] FLAG: --application-metrics-count-limit="100" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039319 4555 flags.go:64] FLAG: --authentication-token-webhook="false" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039322 4555 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039327 4555 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039332 4555 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039336 4555 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039340 4555 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039345 4555 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039349 4555 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039354 4555 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039358 
4555 flags.go:64] FLAG: --cgroup-root="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039362 4555 flags.go:64] FLAG: --cgroups-per-qos="true" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039365 4555 flags.go:64] FLAG: --client-ca-file="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039369 4555 flags.go:64] FLAG: --cloud-config="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039373 4555 flags.go:64] FLAG: --cloud-provider="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039376 4555 flags.go:64] FLAG: --cluster-dns="[]" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039382 4555 flags.go:64] FLAG: --cluster-domain="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039385 4555 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039389 4555 flags.go:64] FLAG: --config-dir="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039392 4555 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039396 4555 flags.go:64] FLAG: --container-log-max-files="5" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039402 4555 flags.go:64] FLAG: --container-log-max-size="10Mi" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039405 4555 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039409 4555 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039413 4555 flags.go:64] FLAG: --containerd-namespace="k8s.io" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039416 4555 flags.go:64] FLAG: --contention-profiling="false" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039420 4555 flags.go:64] FLAG: --cpu-cfs-quota="true" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039424 4555 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039428 4555 flags.go:64] FLAG: --cpu-manager-policy="none" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039431 4555 flags.go:64] FLAG: --cpu-manager-policy-options="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039436 4555 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039440 4555 flags.go:64] FLAG: --enable-controller-attach-detach="true" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039443 4555 flags.go:64] FLAG: --enable-debugging-handlers="true" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039447 4555 flags.go:64] FLAG: --enable-load-reader="false" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039450 4555 flags.go:64] FLAG: --enable-server="true" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039454 4555 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039459 4555 flags.go:64] FLAG: --event-burst="100" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039464 4555 flags.go:64] FLAG: --event-qps="50" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039468 4555 flags.go:64] FLAG: --event-storage-age-limit="default=0" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039471 4555 flags.go:64] FLAG: --event-storage-event-limit="default=0" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039475 4555 flags.go:64] FLAG: 
--eviction-hard="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039480 4555 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039483 4555 flags.go:64] FLAG: --eviction-minimum-reclaim="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039487 4555 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039491 4555 flags.go:64] FLAG: --eviction-soft="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039495 4555 flags.go:64] FLAG: --eviction-soft-grace-period="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039499 4555 flags.go:64] FLAG: --exit-on-lock-contention="false" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039502 4555 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039508 4555 flags.go:64] FLAG: --experimental-mounter-path="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039512 4555 flags.go:64] FLAG: --fail-cgroupv1="false" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039516 4555 flags.go:64] FLAG: --fail-swap-on="true" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039519 4555 flags.go:64] FLAG: --feature-gates="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039524 4555 flags.go:64] FLAG: --file-check-frequency="20s" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039528 4555 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039532 4555 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039535 4555 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039539 4555 flags.go:64] FLAG: --healthz-port="10248" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039543 4555 flags.go:64] FLAG: --help="false" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039547 4555 flags.go:64] FLAG: --hostname-override="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039551 4555 flags.go:64] FLAG: --housekeeping-interval="10s" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039554 4555 flags.go:64] FLAG: --http-check-frequency="20s" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039558 4555 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039562 4555 flags.go:64] FLAG: --image-credential-provider-config="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039565 4555 flags.go:64] FLAG: --image-gc-high-threshold="85" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039569 4555 flags.go:64] FLAG: --image-gc-low-threshold="80" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039572 4555 flags.go:64] FLAG: --image-service-endpoint="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039576 4555 flags.go:64] FLAG: --kernel-memcg-notification="false" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039580 4555 flags.go:64] FLAG: --kube-api-burst="100" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039584 4555 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039587 4555 flags.go:64] FLAG: --kube-api-qps="50" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039591 4555 flags.go:64] FLAG: 
--kube-reserved="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039596 4555 flags.go:64] FLAG: --kube-reserved-cgroup="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039600 4555 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039604 4555 flags.go:64] FLAG: --kubelet-cgroups="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039607 4555 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039611 4555 flags.go:64] FLAG: --lock-file="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039614 4555 flags.go:64] FLAG: --log-cadvisor-usage="false" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039617 4555 flags.go:64] FLAG: --log-flush-frequency="5s" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039621 4555 flags.go:64] FLAG: --log-json-info-buffer-size="0" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039627 4555 flags.go:64] FLAG: --log-json-split-stream="false" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039630 4555 flags.go:64] FLAG: --log-text-info-buffer-size="0" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039635 4555 flags.go:64] FLAG: --log-text-split-stream="false" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039639 4555 flags.go:64] FLAG: --logging-format="text" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039642 4555 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039647 4555 flags.go:64] FLAG: --make-iptables-util-chains="true" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039651 4555 flags.go:64] FLAG: --manifest-url="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039655 4555 flags.go:64] FLAG: --manifest-url-header="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039659 4555 flags.go:64] FLAG: --max-housekeeping-interval="15s" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039663 4555 flags.go:64] FLAG: --max-open-files="1000000" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039680 4555 flags.go:64] FLAG: --max-pods="110" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039684 4555 flags.go:64] FLAG: --maximum-dead-containers="-1" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039688 4555 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039692 4555 flags.go:64] FLAG: --memory-manager-policy="None" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039696 4555 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039700 4555 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039703 4555 flags.go:64] FLAG: --node-ip="192.168.126.11" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039707 4555 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039716 4555 flags.go:64] FLAG: --node-status-max-images="50" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039719 4555 flags.go:64] FLAG: --node-status-update-frequency="10s" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039724 4555 flags.go:64] FLAG: --oom-score-adj="-999" Dec 09 14:58:10 crc 
kubenswrapper[4555]: I1209 14:58:10.039727 4555 flags.go:64] FLAG: --pod-cidr="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039731 4555 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039737 4555 flags.go:64] FLAG: --pod-manifest-path="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039740 4555 flags.go:64] FLAG: --pod-max-pids="-1" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039744 4555 flags.go:64] FLAG: --pods-per-core="0" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039748 4555 flags.go:64] FLAG: --port="10250" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039751 4555 flags.go:64] FLAG: --protect-kernel-defaults="false" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039755 4555 flags.go:64] FLAG: --provider-id="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039758 4555 flags.go:64] FLAG: --qos-reserved="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039762 4555 flags.go:64] FLAG: --read-only-port="10255" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039766 4555 flags.go:64] FLAG: --register-node="true" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039770 4555 flags.go:64] FLAG: --register-schedulable="true" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039774 4555 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039784 4555 flags.go:64] FLAG: --registry-burst="10" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039788 4555 flags.go:64] FLAG: --registry-qps="5" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039792 4555 flags.go:64] FLAG: --reserved-cpus="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039795 4555 flags.go:64] FLAG: --reserved-memory="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039801 4555 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039804 4555 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039808 4555 flags.go:64] FLAG: --rotate-certificates="false" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039812 4555 flags.go:64] FLAG: --rotate-server-certificates="false" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039816 4555 flags.go:64] FLAG: --runonce="false" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039829 4555 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039833 4555 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039837 4555 flags.go:64] FLAG: --seccomp-default="false" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039841 4555 flags.go:64] FLAG: --serialize-image-pulls="true" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039844 4555 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039848 4555 flags.go:64] FLAG: --storage-driver-db="cadvisor" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039852 4555 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039856 4555 flags.go:64] FLAG: --storage-driver-password="root" 
Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039859 4555 flags.go:64] FLAG: --storage-driver-secure="false" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039863 4555 flags.go:64] FLAG: --storage-driver-table="stats" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039866 4555 flags.go:64] FLAG: --storage-driver-user="root" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039870 4555 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039875 4555 flags.go:64] FLAG: --sync-frequency="1m0s" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039880 4555 flags.go:64] FLAG: --system-cgroups="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039884 4555 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039891 4555 flags.go:64] FLAG: --system-reserved-cgroup="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039895 4555 flags.go:64] FLAG: --tls-cert-file="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039898 4555 flags.go:64] FLAG: --tls-cipher-suites="[]" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039904 4555 flags.go:64] FLAG: --tls-min-version="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039908 4555 flags.go:64] FLAG: --tls-private-key-file="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039911 4555 flags.go:64] FLAG: --topology-manager-policy="none" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039915 4555 flags.go:64] FLAG: --topology-manager-policy-options="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039919 4555 flags.go:64] FLAG: --topology-manager-scope="container" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039923 4555 flags.go:64] FLAG: --v="2" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039928 4555 flags.go:64] FLAG: --version="false" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039933 4555 flags.go:64] FLAG: --vmodule="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039937 4555 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.039941 4555 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.040019 4555 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.040024 4555 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.040028 4555 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.040032 4555 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.040036 4555 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.040039 4555 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.040043 4555 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
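The flags.go:64 FLAG: lines above record every command-line flag after parsing, including defaults that were never set explicitly, so they are only the flag-level view; anything not set on the command line can still be superseded by the file given via --config (/etc/kubernetes/kubelet.conf here). To pull just the flag/value pairs out of the journal for comparison against that file, a sketch assuming shell access to the node:

    journalctl -b -u kubelet --no-pager \
      | grep -o 'FLAG: --[a-z0-9-]*="[^"]*"' \
      | sort -u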
Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.040047 4555 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.040051 4555 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.040055 4555 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.040058 4555 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.040062 4555 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.040065 4555 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.040068 4555 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.040071 4555 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.040074 4555 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.040077 4555 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.040080 4555 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.040083 4555 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.040087 4555 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.040090 4555 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.040093 4555 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.040096 4555 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.040099 4555 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.040102 4555 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.040106 4555 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.040109 4555 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.040113 4555 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.040117 4555 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.040121 4555 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.040124 4555 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.040128 4555 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.040132 4555 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.040135 4555 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.040139 4555 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.040142 4555 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.040145 4555 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.040148 4555 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.040152 4555 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.040155 4555 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.040159 4555 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.040162 4555 feature_gate.go:330] unrecognized feature gate: Example Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.040167 4555 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.040170 4555 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.040173 4555 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.040177 4555 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.040180 4555 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.040183 4555 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.040186 4555 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.040189 4555 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.040192 4555 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.040195 4555 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.040198 4555 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.040202 4555 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.040205 4555 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.040208 4555 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.040212 4555 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.040215 4555 feature_gate.go:330] unrecognized 
feature gate: CSIDriverSharedResource Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.040218 4555 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.040222 4555 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.040225 4555 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.040228 4555 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.040231 4555 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.040234 4555 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.040238 4555 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.040241 4555 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.040245 4555 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.040248 4555 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.040251 4555 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.040255 4555 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.040258 4555 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.040602 4555 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.048100 4555 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.048126 4555 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048198 4555 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048208 4555 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048213 4555 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048216 4555 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048220 4555 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048225 4555 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
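Each pass over the gate list ends in the same feature gates: {map[...]} summary: only the gates the kubelet itself registers survive, with CloudDualStackNodeIPs, DisableKubeletCloudCredentialProviders, KMSv1 and ValidatingAdmissionPolicy forced on, and the kubelet reports itself as v1.31.5. A quick cross-check from a workstation, assuming the oc client, cluster-admin access to this CRC cluster, and the node name crc:

    oc get node crc -o jsonpath='{.status.nodeInfo.kubeletVersion}{"\n"}'   # expected: v1.31.5
    oc get featuregate cluster -o jsonpath='{.spec.featureSet}{"\n"}'       # cluster feature set the extra gate names are propagated from (may print empty for the default set)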
Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048231 4555 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048235 4555 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048238 4555 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048241 4555 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048245 4555 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048248 4555 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048251 4555 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048261 4555 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048265 4555 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048269 4555 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048273 4555 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048277 4555 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048281 4555 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048285 4555 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048289 4555 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048292 4555 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048296 4555 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048313 4555 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048317 4555 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048320 4555 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048324 4555 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048327 4555 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048331 4555 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048334 4555 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048338 4555 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048342 4555 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048345 4555 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048348 4555 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048351 4555 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048355 4555 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048358 4555 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048361 4555 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048364 4555 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048367 4555 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048370 4555 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048373 4555 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048377 4555 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048380 4555 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048383 4555 feature_gate.go:330] unrecognized 
feature gate: AWSClusterHostedDNS Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048386 4555 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048390 4555 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048395 4555 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048398 4555 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048401 4555 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048404 4555 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048408 4555 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048411 4555 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048414 4555 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048418 4555 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048421 4555 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048424 4555 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048427 4555 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048431 4555 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048434 4555 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048437 4555 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048440 4555 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048443 4555 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048447 4555 feature_gate.go:330] unrecognized feature gate: Example Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048450 4555 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048453 4555 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048456 4555 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048459 4555 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048462 4555 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048465 4555 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048468 
4555 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.048475 4555 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048591 4555 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048598 4555 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048601 4555 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048604 4555 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048607 4555 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048611 4555 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048614 4555 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048617 4555 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048621 4555 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048624 4555 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048628 4555 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048633 4555 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048637 4555 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048640 4555 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048644 4555 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048647 4555 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048651 4555 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048654 4555 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048657 4555 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048661 4555 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048664 4555 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048667 4555 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048670 4555 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048673 4555 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048677 4555 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048680 4555 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048684 4555 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048688 4555 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048691 4555 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048695 4555 feature_gate.go:330] unrecognized feature gate: Example Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048698 4555 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048701 4555 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048705 4555 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048708 4555 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048711 4555 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048714 4555 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048717 4555 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048720 4555 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048724 4555 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048727 4555 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048730 4555 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048734 4555 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048737 4555 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048741 4555 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048744 4555 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048747 4555 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048750 4555 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048754 4555 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048758 4555 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048763 4555 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048767 4555 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048770 4555 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048774 4555 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048777 4555 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048780 4555 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048783 4555 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048786 4555 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048790 4555 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048793 4555 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048796 4555 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048799 4555 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048802 4555 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048805 4555 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048809 4555 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048812 4555 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048816 4555 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048828 4555 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048832 4555 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048835 4555 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048838 4555 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.048841 4555 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.048847 4555 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 
09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.049377 4555 server.go:940] "Client rotation is on, will bootstrap in background" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.052020 4555 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.052100 4555 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.053745 4555 server.go:997] "Starting client certificate rotation" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.053768 4555 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.053908 4555 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-14 15:54:54.944411219 +0000 UTC Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.053998 4555 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.064232 4555 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.065522 4555 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 09 14:58:10 crc kubenswrapper[4555]: E1209 14:58:10.065627 4555 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 192.168.26.32:6443: connect: connection refused" logger="UnhandledError" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.074836 4555 log.go:25] "Validated CRI v1 runtime API" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.091271 4555 log.go:25] "Validated CRI v1 image API" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.092688 4555 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.095781 4555 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-12-09-14-54-51-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.095804 4555 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:49 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/containers/storage/overlay-containers/75d81934760b26101869fbd8e4b5954c62b019c1cc3e5a0c9f82ed8de46b3b22/userdata/shm:{mountpoint:/var/lib/containers/storage/overlay-containers/75d81934760b26101869fbd8e4b5954c62b019c1cc3e5a0c9f82ed8de46b3b22/userdata/shm major:0 minor:42 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:50 fsType:tmpfs 
blockSize:0} overlay_0-43:{mountpoint:/var/lib/containers/storage/overlay/94b752e0a51c0134b00ddef6dc7a933a9d7c1d9bdc88a18dae4192a0d557d623/merged major:0 minor:43 fsType:overlay blockSize:0}] Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.110596 4555 manager.go:217] Machine: {Timestamp:2025-12-09 14:58:10.109101196 +0000 UTC m=+0.245813914 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2445406 MemoryCapacity:33654120448 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:bd9f288e-1f46-4e96-a21f-7eb84ab89d9c BootID:73387e21-fe11-4735-b3c1-0a3cfb47f8fd Filesystems:[{Device:/run/user/1000 DeviceMajor:0 DeviceMinor:49 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827060224 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/var/lib/containers/storage/overlay-containers/75d81934760b26101869fbd8e4b5954c62b019c1cc3e5a0c9f82ed8de46b3b22/userdata/shm DeviceMajor:0 DeviceMinor:42 Capacity:65536000 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-43 DeviceMajor:0 DeviceMinor:43 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:50 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:97:36:e2 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:enp3s0 MacAddress:fa:16:3e:97:36:e2 Speed:-1 Mtu:1500} {Name:enp7s0 MacAddress:fa:16:3e:30:86:1c Speed:-1 Mtu:1440} {Name:enp7s0.20 MacAddress:52:54:00:c7:65:00 Speed:-1 Mtu:1436} {Name:enp7s0.21 MacAddress:52:54:00:44:f0:fa Speed:-1 Mtu:1436} {Name:enp7s0.22 MacAddress:52:54:00:3e:22:6d Speed:-1 Mtu:1436} {Name:eth10 MacAddress:3a:81:14:97:45:b1 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:76:be:80:e5:3a:bd Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654120448 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:65536 Type:Data Level:1} {Id:0 Size:65536 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:65536 Type:Data Level:1} {Id:1 Size:65536 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:65536 Type:Data Level:1} {Id:10 Size:65536 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:65536 Type:Data Level:1} {Id:11 Size:65536 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified 
Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:65536 Type:Data Level:1} {Id:2 Size:65536 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:65536 Type:Data Level:1} {Id:3 Size:65536 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:65536 Type:Data Level:1} {Id:4 Size:65536 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:65536 Type:Data Level:1} {Id:5 Size:65536 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:65536 Type:Data Level:1} {Id:6 Size:65536 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:65536 Type:Data Level:1} {Id:7 Size:65536 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:65536 Type:Data Level:1} {Id:8 Size:65536 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:65536 Type:Data Level:1} {Id:9 Size:65536 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.110764 4555 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.110882 4555 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.111687 4555 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.111859 4555 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.111884 4555 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.112041 4555 topology_manager.go:138] "Creating topology manager with none policy" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.112051 4555 container_manager_linux.go:303] "Creating device plugin manager" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.112390 4555 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.112416 4555 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.112756 4555 state_mem.go:36] "Initialized new in-memory state store" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.112825 4555 server.go:1245] "Using root directory" path="/var/lib/kubelet" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.114420 4555 kubelet.go:418] "Attempting to sync node with API server" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.114439 4555 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" 
Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.114469 4555 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.114479 4555 kubelet.go:324] "Adding apiserver pod source" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.114488 4555 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.116546 4555 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.117173 4555 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.117275 4555 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.26.32:6443: connect: connection refused Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.117271 4555 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 192.168.26.32:6443: connect: connection refused Dec 09 14:58:10 crc kubenswrapper[4555]: E1209 14:58:10.117350 4555 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.26.32:6443: connect: connection refused" logger="UnhandledError" Dec 09 14:58:10 crc kubenswrapper[4555]: E1209 14:58:10.117356 4555 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 192.168.26.32:6443: connect: connection refused" logger="UnhandledError" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.118591 4555 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.119396 4555 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.119432 4555 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.119444 4555 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.119450 4555 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.119468 4555 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.119479 4555 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.119490 4555 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.119510 4555 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/downward-api" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.119522 4555 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.119535 4555 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.119544 4555 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.119550 4555 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.120029 4555 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.120480 4555 server.go:1280] "Started kubelet" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.120679 4555 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 192.168.26.32:6443: connect: connection refused Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.121193 4555 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.121202 4555 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 09 14:58:10 crc systemd[1]: Started Kubernetes Kubelet. Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.121756 4555 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.123337 4555 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.123365 4555 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.123466 4555 volume_manager.go:287] "The desired_state_of_world populator starts" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.123491 4555 volume_manager.go:289] "Starting Kubelet Volume Manager" Dec 09 14:58:10 crc kubenswrapper[4555]: E1209 14:58:10.123659 4555 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.123696 4555 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.123384 4555 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 09:24:28.194065603 +0000 UTC Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.125869 4555 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.125888 4555 factory.go:55] Registering systemd factory Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.125896 4555 factory.go:221] Registration of the systemd container factory successfully Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.127106 4555 server.go:460] "Adding debug handlers to kubelet server" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 
14:58:10.127870 4555 factory.go:153] Registering CRI-O factory Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.127892 4555 factory.go:221] Registration of the crio container factory successfully Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.128014 4555 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.26.32:6443: connect: connection refused Dec 09 14:58:10 crc kubenswrapper[4555]: E1209 14:58:10.128061 4555 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.26.32:6443: connect: connection refused" logger="UnhandledError" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.128184 4555 factory.go:103] Registering Raw factory Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.128285 4555 manager.go:1196] Started watching for new ooms in manager Dec 09 14:58:10 crc kubenswrapper[4555]: E1209 14:58:10.128427 4555 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.32:6443: connect: connection refused" interval="200ms" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.129123 4555 manager.go:319] Starting recovery of all containers Dec 09 14:58:10 crc kubenswrapper[4555]: E1209 14:58:10.129177 4555 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 192.168.26.32:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187f93fc255d8744 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-09 14:58:10.120451908 +0000 UTC m=+0.257164626,LastTimestamp:2025-12-09 14:58:10.120451908 +0000 UTC m=+0.257164626,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.132921 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.132956 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.132967 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.132975 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.132983 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.132993 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133001 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133009 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133019 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133027 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133035 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133043 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133051 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133063 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133070 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133078 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133086 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133093 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133101 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133109 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133117 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133126 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133133 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133141 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133149 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133157 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133166 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133176 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133184 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133191 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133199 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133221 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133229 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133236 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133245 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133252 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133260 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133283 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133291 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133318 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133328 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133337 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133345 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133353 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133362 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133369 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133378 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133385 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133393 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133401 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133408 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133416 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133452 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133462 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133470 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133478 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133487 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133494 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133502 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133510 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133529 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133536 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133544 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133552 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133561 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133568 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133575 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133583 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133591 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133599 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133606 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133613 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133623 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133630 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133639 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133648 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133667 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133675 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133683 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133690 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133698 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133706 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133713 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133721 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133729 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133738 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133747 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133755 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133763 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133770 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133778 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133786 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133794 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133802 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133809 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133817 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133825 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133832 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133849 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133858 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133865 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133873 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133881 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.133889 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.134686 4555 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.134718 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.134730 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.134741 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.134750 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.134760 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.134769 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.134778 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.134787 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.134796 
4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.134804 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.134812 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.134820 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.134830 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.134848 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.134857 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.134865 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.134873 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.134881 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.134889 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.134896 4555 reconstruct.go:130] "Volume is marked 
as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.134904 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.134911 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.134919 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.134927 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.134938 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.134945 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.134963 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.134971 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.134979 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.134987 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.134995 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.135002 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.135010 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.135018 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.135026 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.135034 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.135042 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.135050 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.135059 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.135068 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.135075 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.135083 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.135090 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.135098 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.135106 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.135114 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.135121 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.135144 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.135151 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.135181 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.135193 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.135201 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.135208 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.135215 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.135222 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.135231 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.135238 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.135246 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.135256 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.135264 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.135271 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.135282 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.135290 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.135315 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.135324 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.135332 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.135345 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.135353 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.135361 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.135368 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.135376 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.135383 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.135392 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.135425 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.135438 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" 
volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.135446 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.135456 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.135464 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.135472 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.135492 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.135500 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.135507 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.135517 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.135525 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.135532 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.135540 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.135547 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.135555 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.135568 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.135575 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.135584 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.135591 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.135616 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.135626 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.135635 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.135642 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.135651 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.135659 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.135668 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.135675 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.135683 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.135691 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.135700 4555 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.135707 4555 reconstruct.go:97] "Volume reconstruction finished" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.135716 4555 reconciler.go:26] "Reconciler: start to sync state" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.143003 4555 manager.go:324] Recovery completed Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.151401 4555 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.153016 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.153052 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.153061 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.154161 4555 cpu_manager.go:225] "Starting CPU manager" policy="none" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.154177 4555 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.154191 4555 state_mem.go:36] "Initialized new in-memory state store" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.158664 4555 kubelet_network_linux.go:50] 
"Initialized iptables rules." protocol="IPv4" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.158706 4555 policy_none.go:49] "None policy: Start" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.159127 4555 memory_manager.go:170] "Starting memorymanager" policy="None" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.159146 4555 state_mem.go:35] "Initializing new in-memory state store" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.159852 4555 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.159886 4555 status_manager.go:217] "Starting to sync pod status with apiserver" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.159909 4555 kubelet.go:2335] "Starting kubelet main sync loop" Dec 09 14:58:10 crc kubenswrapper[4555]: E1209 14:58:10.159943 4555 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.161451 4555 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.26.32:6443: connect: connection refused Dec 09 14:58:10 crc kubenswrapper[4555]: E1209 14:58:10.161507 4555 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.26.32:6443: connect: connection refused" logger="UnhandledError" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.202005 4555 manager.go:334] "Starting Device Plugin manager" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.202042 4555 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.202051 4555 server.go:79] "Starting device plugin registration server" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.202272 4555 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.202288 4555 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.202400 4555 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.202460 4555 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.202473 4555 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 09 14:58:10 crc kubenswrapper[4555]: E1209 14:58:10.207544 4555 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.261036 4555 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 09 14:58:10 crc 
kubenswrapper[4555]: I1209 14:58:10.261122 4555 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.261872 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.261898 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.261906 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.261992 4555 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.262232 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.262335 4555 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.262466 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.262488 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.262496 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.262569 4555 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.262651 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.262685 4555 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.263105 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.263125 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.263134 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.263216 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.263248 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.263256 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.263374 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.263436 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.263493 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.263601 4555 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.263657 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.263760 4555 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.264239 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.264266 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.264273 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.264404 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.264423 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.264430 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.264492 4555 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.264598 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.264622 4555 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.264935 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.264953 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.264961 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.265045 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.265067 4555 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.265151 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.265189 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.265199 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.265499 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.265591 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.265662 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.302664 4555 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.303589 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.303613 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.303621 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.303634 4555 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 09 14:58:10 crc kubenswrapper[4555]: E1209 14:58:10.303932 4555 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 192.168.26.32:6443: connect: connection refused" node="crc" Dec 09 14:58:10 crc kubenswrapper[4555]: E1209 14:58:10.329081 4555 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.32:6443: connect: connection refused" interval="400ms" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.337366 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.337409 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.337426 4555 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.337454 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.337472 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.337487 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.337500 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.337514 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.337528 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.337541 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.337553 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.337573 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.337588 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.337600 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.337611 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.438674 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.438721 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.438775 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.438717 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.438845 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.438860 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.438873 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.438926 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.438951 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.438994 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.439008 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.439007 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.439027 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.439043 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.439082 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.439115 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.439135 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.439116 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.439135 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.439174 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.439186 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.439159 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.439235 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.439255 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.439270 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.439326 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.439296 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.439346 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.439356 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.439385 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.504969 4555 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.506114 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.506143 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.506151 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.506169 4555 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 09 14:58:10 crc kubenswrapper[4555]: E1209 14:58:10.506549 4555 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 192.168.26.32:6443: connect: connection refused" node="crc" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.594205 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.608957 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.614994 4555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-a9e76f4e8dd9e81f06a29faa49221d256e70877b370858b1a72e8be4e5d381c1 WatchSource:0}: Error finding container a9e76f4e8dd9e81f06a29faa49221d256e70877b370858b1a72e8be4e5d381c1: Status 404 returned error can't find the container with id a9e76f4e8dd9e81f06a29faa49221d256e70877b370858b1a72e8be4e5d381c1 Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.622983 4555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-1f2a74633817c2c0c028f4f78808a5cc5c09094db4dfc1305057896b7896b4f2 WatchSource:0}: Error finding container 1f2a74633817c2c0c028f4f78808a5cc5c09094db4dfc1305057896b7896b4f2: Status 404 returned error can't find the container with id 1f2a74633817c2c0c028f4f78808a5cc5c09094db4dfc1305057896b7896b4f2 Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.634413 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.642120 4555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-fb92798c022be5bc5a9d3b0ae7df3f158ce53cea2ac15c411c41511be423b705 WatchSource:0}: Error finding container fb92798c022be5bc5a9d3b0ae7df3f158ce53cea2ac15c411c41511be423b705: Status 404 returned error can't find the container with id fb92798c022be5bc5a9d3b0ae7df3f158ce53cea2ac15c411c41511be423b705 Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.655003 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.658902 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.661330 4555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-c075ae0abf51cb12152c4b3a188ae6b8657201dd228a865dd61c31e892f7c6fd WatchSource:0}: Error finding container c075ae0abf51cb12152c4b3a188ae6b8657201dd228a865dd61c31e892f7c6fd: Status 404 returned error can't find the container with id c075ae0abf51cb12152c4b3a188ae6b8657201dd228a865dd61c31e892f7c6fd Dec 09 14:58:10 crc kubenswrapper[4555]: W1209 14:58:10.666037 4555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-9b6d5e02d03de0e240f31fb9bd6cd331985148eb769416d1af49a7b7b327b074 WatchSource:0}: Error finding container 9b6d5e02d03de0e240f31fb9bd6cd331985148eb769416d1af49a7b7b327b074: Status 404 returned error can't find the container with id 9b6d5e02d03de0e240f31fb9bd6cd331985148eb769416d1af49a7b7b327b074 Dec 09 14:58:10 crc kubenswrapper[4555]: E1209 14:58:10.730451 4555 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.32:6443: connect: connection refused" interval="800ms" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.907317 4555 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.908154 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.908186 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.908195 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:10 crc kubenswrapper[4555]: I1209 14:58:10.908215 4555 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 09 14:58:10 crc kubenswrapper[4555]: E1209 14:58:10.908569 4555 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 192.168.26.32:6443: connect: connection refused" node="crc" Dec 09 14:58:11 crc kubenswrapper[4555]: W1209 14:58:11.016780 4555 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 192.168.26.32:6443: connect: connection refused Dec 09 14:58:11 crc kubenswrapper[4555]: E1209 14:58:11.017041 4555 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 192.168.26.32:6443: connect: connection refused" logger="UnhandledError" Dec 09 14:58:11 crc kubenswrapper[4555]: I1209 14:58:11.121842 4555 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 
192.168.26.32:6443: connect: connection refused Dec 09 14:58:11 crc kubenswrapper[4555]: I1209 14:58:11.126107 4555 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 06:21:39.820502745 +0000 UTC Dec 09 14:58:11 crc kubenswrapper[4555]: W1209 14:58:11.136546 4555 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.26.32:6443: connect: connection refused Dec 09 14:58:11 crc kubenswrapper[4555]: E1209 14:58:11.136594 4555 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.26.32:6443: connect: connection refused" logger="UnhandledError" Dec 09 14:58:11 crc kubenswrapper[4555]: I1209 14:58:11.164405 4555 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="a30d185f354ca39197fe30559783400a7d218ef0a162fd5fa5e5039d1946d127" exitCode=0 Dec 09 14:58:11 crc kubenswrapper[4555]: I1209 14:58:11.164475 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"a30d185f354ca39197fe30559783400a7d218ef0a162fd5fa5e5039d1946d127"} Dec 09 14:58:11 crc kubenswrapper[4555]: I1209 14:58:11.164564 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"9b6d5e02d03de0e240f31fb9bd6cd331985148eb769416d1af49a7b7b327b074"} Dec 09 14:58:11 crc kubenswrapper[4555]: I1209 14:58:11.164641 4555 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:11 crc kubenswrapper[4555]: I1209 14:58:11.165448 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"376e853c04e3ccafef162872380c7bf1dc0301c04c70c5428b965ca18989088f"} Dec 09 14:58:11 crc kubenswrapper[4555]: I1209 14:58:11.165472 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c075ae0abf51cb12152c4b3a188ae6b8657201dd228a865dd61c31e892f7c6fd"} Dec 09 14:58:11 crc kubenswrapper[4555]: I1209 14:58:11.165506 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:11 crc kubenswrapper[4555]: I1209 14:58:11.165529 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:11 crc kubenswrapper[4555]: I1209 14:58:11.165537 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:11 crc kubenswrapper[4555]: I1209 14:58:11.166637 4555 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d" exitCode=0 Dec 09 14:58:11 crc 
kubenswrapper[4555]: I1209 14:58:11.166690 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d"} Dec 09 14:58:11 crc kubenswrapper[4555]: I1209 14:58:11.166710 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"fb92798c022be5bc5a9d3b0ae7df3f158ce53cea2ac15c411c41511be423b705"} Dec 09 14:58:11 crc kubenswrapper[4555]: I1209 14:58:11.166775 4555 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:11 crc kubenswrapper[4555]: I1209 14:58:11.167253 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:11 crc kubenswrapper[4555]: I1209 14:58:11.167274 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:11 crc kubenswrapper[4555]: I1209 14:58:11.167282 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:11 crc kubenswrapper[4555]: I1209 14:58:11.167970 4555 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766" exitCode=0 Dec 09 14:58:11 crc kubenswrapper[4555]: I1209 14:58:11.168149 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766"} Dec 09 14:58:11 crc kubenswrapper[4555]: I1209 14:58:11.168250 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"1f2a74633817c2c0c028f4f78808a5cc5c09094db4dfc1305057896b7896b4f2"} Dec 09 14:58:11 crc kubenswrapper[4555]: I1209 14:58:11.168538 4555 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:11 crc kubenswrapper[4555]: I1209 14:58:11.168677 4555 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:11 crc kubenswrapper[4555]: I1209 14:58:11.170320 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:11 crc kubenswrapper[4555]: I1209 14:58:11.170343 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:11 crc kubenswrapper[4555]: I1209 14:58:11.170351 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:11 crc kubenswrapper[4555]: I1209 14:58:11.170780 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:11 crc kubenswrapper[4555]: I1209 14:58:11.170809 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:11 crc kubenswrapper[4555]: I1209 14:58:11.170818 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:11 crc kubenswrapper[4555]: I1209 
14:58:11.171085 4555 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="7803e784d08586ae7bb48c47ef9e647dd0e53d5eb079ac8f4cfd1b74bcd15564" exitCode=0 Dec 09 14:58:11 crc kubenswrapper[4555]: I1209 14:58:11.171111 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"7803e784d08586ae7bb48c47ef9e647dd0e53d5eb079ac8f4cfd1b74bcd15564"} Dec 09 14:58:11 crc kubenswrapper[4555]: I1209 14:58:11.171126 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"a9e76f4e8dd9e81f06a29faa49221d256e70877b370858b1a72e8be4e5d381c1"} Dec 09 14:58:11 crc kubenswrapper[4555]: I1209 14:58:11.171170 4555 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:11 crc kubenswrapper[4555]: I1209 14:58:11.171665 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:11 crc kubenswrapper[4555]: I1209 14:58:11.171686 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:11 crc kubenswrapper[4555]: I1209 14:58:11.171695 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:11 crc kubenswrapper[4555]: W1209 14:58:11.214511 4555 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.26.32:6443: connect: connection refused Dec 09 14:58:11 crc kubenswrapper[4555]: E1209 14:58:11.214576 4555 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.26.32:6443: connect: connection refused" logger="UnhandledError" Dec 09 14:58:11 crc kubenswrapper[4555]: W1209 14:58:11.335296 4555 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.26.32:6443: connect: connection refused Dec 09 14:58:11 crc kubenswrapper[4555]: E1209 14:58:11.335413 4555 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.26.32:6443: connect: connection refused" logger="UnhandledError" Dec 09 14:58:11 crc kubenswrapper[4555]: E1209 14:58:11.531399 4555 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.32:6443: connect: connection refused" interval="1.6s" Dec 09 14:58:11 crc kubenswrapper[4555]: I1209 14:58:11.709297 4555 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:11 crc kubenswrapper[4555]: 
I1209 14:58:11.710274 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:11 crc kubenswrapper[4555]: I1209 14:58:11.710321 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:11 crc kubenswrapper[4555]: I1209 14:58:11.710331 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:11 crc kubenswrapper[4555]: I1209 14:58:11.710352 4555 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 09 14:58:11 crc kubenswrapper[4555]: E1209 14:58:11.710848 4555 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 192.168.26.32:6443: connect: connection refused" node="crc" Dec 09 14:58:12 crc kubenswrapper[4555]: I1209 14:58:12.126830 4555 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 18:47:55.194300922 +0000 UTC Dec 09 14:58:12 crc kubenswrapper[4555]: I1209 14:58:12.126875 4555 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 795h49m43.067428181s for next certificate rotation Dec 09 14:58:12 crc kubenswrapper[4555]: I1209 14:58:12.175977 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c591bf964c364ed5dfb0b30ae43f7fd456c549f701b0f1b8f7ab9ca7b8648f1b"} Dec 09 14:58:12 crc kubenswrapper[4555]: I1209 14:58:12.176008 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7803e19de1c3c096dfb29cd40b66a640cf8fe695b24eaf0ecfef38e88feefc27"} Dec 09 14:58:12 crc kubenswrapper[4555]: I1209 14:58:12.176018 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"26cd3af11c101edd63da0124a2190c0ee60408cf12e241de0b5743aff5732738"} Dec 09 14:58:12 crc kubenswrapper[4555]: I1209 14:58:12.176027 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"787954aa7f87026339e7b167ba10c60c81f91c4d0f234fc508684f929edda0ed"} Dec 09 14:58:12 crc kubenswrapper[4555]: I1209 14:58:12.176034 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"cd53bcc6affff59de2add405be1496710b4488d3794d358533ddaeecb1c50c42"} Dec 09 14:58:12 crc kubenswrapper[4555]: I1209 14:58:12.176109 4555 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:12 crc kubenswrapper[4555]: I1209 14:58:12.176709 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:12 crc kubenswrapper[4555]: I1209 14:58:12.176735 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:12 crc kubenswrapper[4555]: I1209 14:58:12.176743 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Dec 09 14:58:12 crc kubenswrapper[4555]: I1209 14:58:12.177692 4555 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19" exitCode=0 Dec 09 14:58:12 crc kubenswrapper[4555]: I1209 14:58:12.177740 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19"} Dec 09 14:58:12 crc kubenswrapper[4555]: I1209 14:58:12.177818 4555 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:12 crc kubenswrapper[4555]: I1209 14:58:12.178399 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:12 crc kubenswrapper[4555]: I1209 14:58:12.178428 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:12 crc kubenswrapper[4555]: I1209 14:58:12.178436 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:12 crc kubenswrapper[4555]: I1209 14:58:12.179329 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"6504dd2b92e68c220a759270fa99ddc38e403112a4c10b4eaf5d15539769c3fd"} Dec 09 14:58:12 crc kubenswrapper[4555]: I1209 14:58:12.179428 4555 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:12 crc kubenswrapper[4555]: I1209 14:58:12.179995 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:12 crc kubenswrapper[4555]: I1209 14:58:12.180018 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:12 crc kubenswrapper[4555]: I1209 14:58:12.180027 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:12 crc kubenswrapper[4555]: I1209 14:58:12.181639 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"88b44fa2c57434247fea11c1e1fbed34ad1df532e7e6bba0623634541322627f"} Dec 09 14:58:12 crc kubenswrapper[4555]: I1209 14:58:12.181717 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"aa09f69008b086bb69e768437b40d896803a11f82f6949202ed0a456f3a213be"} Dec 09 14:58:12 crc kubenswrapper[4555]: I1209 14:58:12.181785 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"c5c1f863d399da019632ceb8d698cf8aa186d0f533ff8a49da7c66dd11086253"} Dec 09 14:58:12 crc kubenswrapper[4555]: I1209 14:58:12.181871 4555 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:12 crc kubenswrapper[4555]: I1209 14:58:12.182390 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 09 14:58:12 crc kubenswrapper[4555]: I1209 14:58:12.182466 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:12 crc kubenswrapper[4555]: I1209 14:58:12.182524 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:12 crc kubenswrapper[4555]: I1209 14:58:12.183840 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"1b8f1c6dd89c91a8ff2ca8baf361ee78811eeb8e10f2c4bb0504644cdd1302d9"} Dec 09 14:58:12 crc kubenswrapper[4555]: I1209 14:58:12.183865 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6099f0072464b66ae672e70c295fe21a13657b81cc71dcecc55314cd91816c75"} Dec 09 14:58:12 crc kubenswrapper[4555]: I1209 14:58:12.183877 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e5ec008c2fa2ff6b0bffe7a32708ff1d0bfcfc0b6edbdd084977b99a2051bdc1"} Dec 09 14:58:12 crc kubenswrapper[4555]: I1209 14:58:12.183995 4555 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:12 crc kubenswrapper[4555]: I1209 14:58:12.184529 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:12 crc kubenswrapper[4555]: I1209 14:58:12.184559 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:12 crc kubenswrapper[4555]: I1209 14:58:12.184568 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:12 crc kubenswrapper[4555]: I1209 14:58:12.258555 4555 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 09 14:58:13 crc kubenswrapper[4555]: I1209 14:58:13.186602 4555 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820" exitCode=0 Dec 09 14:58:13 crc kubenswrapper[4555]: I1209 14:58:13.186671 4555 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 09 14:58:13 crc kubenswrapper[4555]: I1209 14:58:13.186673 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820"} Dec 09 14:58:13 crc kubenswrapper[4555]: I1209 14:58:13.186698 4555 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:13 crc kubenswrapper[4555]: I1209 14:58:13.186806 4555 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:13 crc kubenswrapper[4555]: I1209 14:58:13.187094 4555 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:13 crc kubenswrapper[4555]: I1209 14:58:13.187276 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 09 14:58:13 crc kubenswrapper[4555]: I1209 14:58:13.187296 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:13 crc kubenswrapper[4555]: I1209 14:58:13.187324 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:13 crc kubenswrapper[4555]: I1209 14:58:13.187478 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:13 crc kubenswrapper[4555]: I1209 14:58:13.187494 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:13 crc kubenswrapper[4555]: I1209 14:58:13.187502 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:13 crc kubenswrapper[4555]: I1209 14:58:13.187689 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:13 crc kubenswrapper[4555]: I1209 14:58:13.187721 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:13 crc kubenswrapper[4555]: I1209 14:58:13.187730 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:13 crc kubenswrapper[4555]: I1209 14:58:13.311890 4555 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:13 crc kubenswrapper[4555]: I1209 14:58:13.312665 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:13 crc kubenswrapper[4555]: I1209 14:58:13.312855 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:13 crc kubenswrapper[4555]: I1209 14:58:13.312864 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:13 crc kubenswrapper[4555]: I1209 14:58:13.312881 4555 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 09 14:58:13 crc kubenswrapper[4555]: I1209 14:58:13.887360 4555 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 14:58:13 crc kubenswrapper[4555]: I1209 14:58:13.892164 4555 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 14:58:14 crc kubenswrapper[4555]: I1209 14:58:14.192071 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"550d7f8a2e6cf94de70026192f2660d59f31fa1675a7c7e432767db5bf58c1b4"} Dec 09 14:58:14 crc kubenswrapper[4555]: I1209 14:58:14.192106 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e9db356805ba4080379f39ddf432276ac77417504341e77c1cc2d704756af1f6"} Dec 09 14:58:14 crc kubenswrapper[4555]: I1209 14:58:14.192117 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"32422f3e8abddcc0d533cce0b4fc55c65309e00a02d4757d415d6f63f9b2e47b"} Dec 09 14:58:14 crc 
kubenswrapper[4555]: I1209 14:58:14.192126 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d8452332d27457e371020c42533ef78285e33851568eed69be71bc9d10e663b2"} Dec 09 14:58:14 crc kubenswrapper[4555]: I1209 14:58:14.192133 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"630ad7ccda8e51ae4d116b5821e71ebcc3fb68b4614de58470610807ad351ed6"} Dec 09 14:58:14 crc kubenswrapper[4555]: I1209 14:58:14.192135 4555 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:14 crc kubenswrapper[4555]: I1209 14:58:14.192216 4555 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:14 crc kubenswrapper[4555]: I1209 14:58:14.192905 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:14 crc kubenswrapper[4555]: I1209 14:58:14.192916 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:14 crc kubenswrapper[4555]: I1209 14:58:14.192926 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:14 crc kubenswrapper[4555]: I1209 14:58:14.192935 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:14 crc kubenswrapper[4555]: I1209 14:58:14.192947 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:14 crc kubenswrapper[4555]: I1209 14:58:14.192936 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:14 crc kubenswrapper[4555]: I1209 14:58:14.393924 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 14:58:14 crc kubenswrapper[4555]: I1209 14:58:14.543904 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 14:58:14 crc kubenswrapper[4555]: I1209 14:58:14.543979 4555 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 09 14:58:14 crc kubenswrapper[4555]: I1209 14:58:14.544005 4555 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:14 crc kubenswrapper[4555]: I1209 14:58:14.544706 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:14 crc kubenswrapper[4555]: I1209 14:58:14.544759 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:14 crc kubenswrapper[4555]: I1209 14:58:14.544770 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:15 crc kubenswrapper[4555]: I1209 14:58:15.147010 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 14:58:15 crc kubenswrapper[4555]: I1209 14:58:15.193803 4555 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:15 crc kubenswrapper[4555]: I1209 14:58:15.193830 4555 
prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 09 14:58:15 crc kubenswrapper[4555]: I1209 14:58:15.193875 4555 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:15 crc kubenswrapper[4555]: I1209 14:58:15.194496 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:15 crc kubenswrapper[4555]: I1209 14:58:15.194524 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:15 crc kubenswrapper[4555]: I1209 14:58:15.194534 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:15 crc kubenswrapper[4555]: I1209 14:58:15.194496 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:15 crc kubenswrapper[4555]: I1209 14:58:15.194599 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:15 crc kubenswrapper[4555]: I1209 14:58:15.194608 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:16 crc kubenswrapper[4555]: I1209 14:58:16.195479 4555 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 09 14:58:16 crc kubenswrapper[4555]: I1209 14:58:16.195506 4555 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:16 crc kubenswrapper[4555]: I1209 14:58:16.196098 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:16 crc kubenswrapper[4555]: I1209 14:58:16.196131 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:16 crc kubenswrapper[4555]: I1209 14:58:16.196141 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:17 crc kubenswrapper[4555]: I1209 14:58:17.102981 4555 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 14:58:17 crc kubenswrapper[4555]: I1209 14:58:17.103112 4555 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:17 crc kubenswrapper[4555]: I1209 14:58:17.103986 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:17 crc kubenswrapper[4555]: I1209 14:58:17.104034 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:17 crc kubenswrapper[4555]: I1209 14:58:17.104048 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:17 crc kubenswrapper[4555]: I1209 14:58:17.272670 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Dec 09 14:58:17 crc kubenswrapper[4555]: I1209 14:58:17.272832 4555 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:17 crc kubenswrapper[4555]: I1209 14:58:17.273563 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:17 crc kubenswrapper[4555]: I1209 14:58:17.273593 4555 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:17 crc kubenswrapper[4555]: I1209 14:58:17.273605 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:17 crc kubenswrapper[4555]: I1209 14:58:17.547433 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 14:58:17 crc kubenswrapper[4555]: I1209 14:58:17.547552 4555 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:17 crc kubenswrapper[4555]: I1209 14:58:17.548421 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:17 crc kubenswrapper[4555]: I1209 14:58:17.548461 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:17 crc kubenswrapper[4555]: I1209 14:58:17.548471 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:19 crc kubenswrapper[4555]: I1209 14:58:19.334836 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 09 14:58:19 crc kubenswrapper[4555]: I1209 14:58:19.335183 4555 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:19 crc kubenswrapper[4555]: I1209 14:58:19.337351 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:19 crc kubenswrapper[4555]: I1209 14:58:19.337394 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:19 crc kubenswrapper[4555]: I1209 14:58:19.337405 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:19 crc kubenswrapper[4555]: I1209 14:58:19.867370 4555 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Dec 09 14:58:19 crc kubenswrapper[4555]: I1209 14:58:19.867501 4555 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:19 crc kubenswrapper[4555]: I1209 14:58:19.868500 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:19 crc kubenswrapper[4555]: I1209 14:58:19.868538 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:19 crc kubenswrapper[4555]: I1209 14:58:19.868548 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:20 crc kubenswrapper[4555]: E1209 14:58:20.207621 4555 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 09 14:58:21 crc kubenswrapper[4555]: I1209 14:58:21.104350 4555 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 14:58:21 crc kubenswrapper[4555]: I1209 14:58:21.104492 4555 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:21 crc kubenswrapper[4555]: I1209 14:58:21.105255 4555 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:21 crc kubenswrapper[4555]: I1209 14:58:21.105288 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:21 crc kubenswrapper[4555]: I1209 14:58:21.105322 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:21 crc kubenswrapper[4555]: I1209 14:58:21.107475 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 14:58:21 crc kubenswrapper[4555]: I1209 14:58:21.203551 4555 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:21 crc kubenswrapper[4555]: I1209 14:58:21.204154 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:21 crc kubenswrapper[4555]: I1209 14:58:21.204181 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:21 crc kubenswrapper[4555]: I1209 14:58:21.204191 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:21 crc kubenswrapper[4555]: I1209 14:58:21.514722 4555 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 09 14:58:21 crc kubenswrapper[4555]: I1209 14:58:21.514773 4555 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 09 14:58:22 crc kubenswrapper[4555]: I1209 14:58:22.107101 4555 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 09 14:58:22 crc kubenswrapper[4555]: [+]log ok Dec 09 14:58:22 crc kubenswrapper[4555]: [+]etcd ok Dec 09 14:58:22 crc kubenswrapper[4555]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Dec 09 14:58:22 crc kubenswrapper[4555]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 09 14:58:22 crc kubenswrapper[4555]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Dec 09 14:58:22 crc kubenswrapper[4555]: [+]poststarthook/openshift.io-api-request-count-filter ok Dec 09 14:58:22 crc kubenswrapper[4555]: [+]poststarthook/openshift.io-startkubeinformers ok Dec 09 14:58:22 crc kubenswrapper[4555]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Dec 09 14:58:22 crc kubenswrapper[4555]: [+]poststarthook/generic-apiserver-start-informers ok Dec 09 14:58:22 crc kubenswrapper[4555]: [+]poststarthook/priority-and-fairness-config-consumer ok Dec 09 14:58:22 crc kubenswrapper[4555]: [+]poststarthook/priority-and-fairness-filter ok Dec 09 14:58:22 crc kubenswrapper[4555]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 09 14:58:22 crc kubenswrapper[4555]: [+]poststarthook/start-apiextensions-informers ok Dec 09 14:58:22 crc 
kubenswrapper[4555]: [+]poststarthook/start-apiextensions-controllers ok Dec 09 14:58:22 crc kubenswrapper[4555]: [+]poststarthook/crd-informer-synced ok Dec 09 14:58:22 crc kubenswrapper[4555]: [+]poststarthook/start-system-namespaces-controller ok Dec 09 14:58:22 crc kubenswrapper[4555]: [+]poststarthook/start-cluster-authentication-info-controller ok Dec 09 14:58:22 crc kubenswrapper[4555]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Dec 09 14:58:22 crc kubenswrapper[4555]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Dec 09 14:58:22 crc kubenswrapper[4555]: [+]poststarthook/start-legacy-token-tracking-controller ok Dec 09 14:58:22 crc kubenswrapper[4555]: [+]poststarthook/start-service-ip-repair-controllers ok Dec 09 14:58:22 crc kubenswrapper[4555]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Dec 09 14:58:22 crc kubenswrapper[4555]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Dec 09 14:58:22 crc kubenswrapper[4555]: [+]poststarthook/priority-and-fairness-config-producer ok Dec 09 14:58:22 crc kubenswrapper[4555]: [+]poststarthook/bootstrap-controller ok Dec 09 14:58:22 crc kubenswrapper[4555]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Dec 09 14:58:22 crc kubenswrapper[4555]: [+]poststarthook/start-kube-aggregator-informers ok Dec 09 14:58:22 crc kubenswrapper[4555]: [+]poststarthook/apiservice-status-local-available-controller ok Dec 09 14:58:22 crc kubenswrapper[4555]: [+]poststarthook/apiservice-status-remote-available-controller ok Dec 09 14:58:22 crc kubenswrapper[4555]: [+]poststarthook/apiservice-registration-controller ok Dec 09 14:58:22 crc kubenswrapper[4555]: [+]poststarthook/apiservice-wait-for-first-sync ok Dec 09 14:58:22 crc kubenswrapper[4555]: [+]poststarthook/apiservice-discovery-controller ok Dec 09 14:58:22 crc kubenswrapper[4555]: [+]poststarthook/kube-apiserver-autoregistration ok Dec 09 14:58:22 crc kubenswrapper[4555]: [+]autoregister-completion ok Dec 09 14:58:22 crc kubenswrapper[4555]: [+]poststarthook/apiservice-openapi-controller ok Dec 09 14:58:22 crc kubenswrapper[4555]: [+]poststarthook/apiservice-openapiv3-controller ok Dec 09 14:58:22 crc kubenswrapper[4555]: livez check failed Dec 09 14:58:22 crc kubenswrapper[4555]: I1209 14:58:22.107147 4555 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 09 14:58:24 crc kubenswrapper[4555]: I1209 14:58:24.105325 4555 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 09 14:58:24 crc kubenswrapper[4555]: I1209 14:58:24.105393 4555 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 09 14:58:25 crc kubenswrapper[4555]: I1209 14:58:25.147995 4555 patch_prober.go:28] 
interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 09 14:58:25 crc kubenswrapper[4555]: I1209 14:58:25.148181 4555 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 09 14:58:26 crc kubenswrapper[4555]: I1209 14:58:26.440274 4555 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 09 14:58:26 crc kubenswrapper[4555]: I1209 14:58:26.440827 4555 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 09 14:58:26 crc kubenswrapper[4555]: E1209 14:58:26.510905 4555 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="3.2s" Dec 09 14:58:26 crc kubenswrapper[4555]: I1209 14:58:26.513287 4555 trace.go:236] Trace[1449563518]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (09-Dec-2025 14:58:13.093) (total time: 13420ms): Dec 09 14:58:26 crc kubenswrapper[4555]: Trace[1449563518]: ---"Objects listed" error: 13419ms (14:58:26.513) Dec 09 14:58:26 crc kubenswrapper[4555]: Trace[1449563518]: [13.420022093s] [13.420022093s] END Dec 09 14:58:26 crc kubenswrapper[4555]: I1209 14:58:26.513442 4555 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 09 14:58:26 crc kubenswrapper[4555]: E1209 14:58:26.513630 4555 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Dec 09 14:58:26 crc kubenswrapper[4555]: I1209 14:58:26.513643 4555 trace.go:236] Trace[1497434993]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (09-Dec-2025 14:58:13.162) (total time: 13350ms): Dec 09 14:58:26 crc kubenswrapper[4555]: Trace[1497434993]: ---"Objects listed" error: 13350ms (14:58:26.513) Dec 09 14:58:26 crc kubenswrapper[4555]: Trace[1497434993]: [13.350955437s] [13.350955437s] END Dec 09 14:58:26 crc kubenswrapper[4555]: I1209 14:58:26.513820 4555 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 09 14:58:26 crc kubenswrapper[4555]: I1209 14:58:26.513683 4555 trace.go:236] Trace[566901715]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (09-Dec-2025 14:58:13.972) (total time: 12541ms): Dec 09 14:58:26 crc kubenswrapper[4555]: Trace[566901715]: ---"Objects listed" error: 12541ms (14:58:26.513) Dec 09 14:58:26 crc kubenswrapper[4555]: 
Trace[566901715]: [12.541570453s] [12.541570453s] END Dec 09 14:58:26 crc kubenswrapper[4555]: I1209 14:58:26.513930 4555 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 09 14:58:26 crc kubenswrapper[4555]: I1209 14:58:26.514073 4555 trace.go:236] Trace[191500702]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (09-Dec-2025 14:58:14.282) (total time: 12231ms): Dec 09 14:58:26 crc kubenswrapper[4555]: Trace[191500702]: ---"Objects listed" error: 12230ms (14:58:26.513) Dec 09 14:58:26 crc kubenswrapper[4555]: Trace[191500702]: [12.23103327s] [12.23103327s] END Dec 09 14:58:26 crc kubenswrapper[4555]: I1209 14:58:26.514096 4555 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 09 14:58:26 crc kubenswrapper[4555]: I1209 14:58:26.515420 4555 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Dec 09 14:58:26 crc kubenswrapper[4555]: I1209 14:58:26.517759 4555 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.107361 4555 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.110296 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.126347 4555 apiserver.go:52] "Watching apiserver" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.128794 4555 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.129207 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-kube-apiserver/kube-apiserver-crc"] Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.129585 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.129756 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.130108 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:58:27 crc kubenswrapper[4555]: E1209 14:58:27.130176 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 14:58:27 crc kubenswrapper[4555]: E1209 14:58:27.130234 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.130294 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.130390 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.130458 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:58:27 crc kubenswrapper[4555]: E1209 14:58:27.130527 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.132200 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.132247 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.132263 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.132328 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.132481 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.132841 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.132894 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.132981 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.132982 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.146819 4555 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.163287 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"031bef68-de1b-47ae-92d9-8f3ab57baecb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd53bcc6affff59de2add405be1496710b4488d3794d358533ddaeecb1c50c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cd3af11c101edd63da0124a2190c0ee60408cf12e241de0b5743aff5732738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://787954aa7f87026339e7b167ba10c60c81f91c4d0f234fc508684f929edda0ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c591bf964c364ed5dfb0b30ae43f7fd456c549f701b0f1b8f7ab9ca7b8648f1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resourc
e-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7803e19de1c3c096dfb29cd40b66a640cf8fe695b24eaf0ecfef38e88feefc27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.170669 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.178320 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.185052 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.192130 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.202714 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.209626 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 14:58:27 crc kubenswrapper[4555]: E1209 14:58:27.219019 4555 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-apiserver-crc\" already exists" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.224483 4555 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.243988 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.250937 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.257525 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.263198 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.270138 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"031bef68-de1b-47ae-92d9-8f3ab57baecb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd53bcc6affff59de2add405be1496710b4488d3794d358533ddaeecb1c50c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cd3af11c101edd63da0124a2190c0ee60408cf12e241de0b5743aff5732738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":
{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://787954aa7f87026339e7b167ba10c60c81f91c4d0f234fc508684f929edda0ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c591bf964c364ed5dfb0b30ae43f7fd456c549f701b0f1b8f7ab9ca7b8648f1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7803e19de1c3c096dfb29cd40b66a640cf8fe695b24eaf0ecfef38e88feefc27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\
\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.275881 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.282153 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.288426 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.318887 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.318923 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.318943 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.319030 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.319049 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.319067 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.319081 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 
14:58:27.319098 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.319115 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.319111 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.319132 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 09 14:58:27 crc kubenswrapper[4555]: E1209 14:58:27.319170 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 14:58:27.819145616 +0000 UTC m=+17.955858333 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.319191 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.319216 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.319232 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.319248 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.319264 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.319278 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.319292 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.319325 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.319342 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" 
(UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.319340 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.319360 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.319375 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.319391 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.319405 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.319418 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.319419 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.319433 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.319449 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.319467 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.319482 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.319495 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.319508 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.319515 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.319523 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.319519 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.319536 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.319593 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.319611 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.319617 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.319620 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.319634 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.319650 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.319666 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.319669 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.319682 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.319697 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.319718 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.319732 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.319748 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.319761 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.319775 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.319790 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.319805 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.319819 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: 
\"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.319832 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.319844 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.319859 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.319874 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.319890 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.319903 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.319917 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.319965 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.319981 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.319996 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod 
\"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.320010 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.320027 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.320042 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.320054 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.320068 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.320083 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.320123 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.320138 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.320154 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.320167 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" 
(UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.320181 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.320206 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.320220 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.320237 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.320261 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.320275 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.320289 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.320323 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.320339 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.320354 4555 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.320368 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.320384 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.320397 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.320410 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.320423 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.320436 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.320453 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.320468 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.320486 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 
14:58:27.320502 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.320516 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.320530 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.320544 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.320558 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.320584 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.320598 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.320613 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.320627 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.320642 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod 
\"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.320655 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.320669 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.320683 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.320697 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.320711 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.320725 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.320738 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.320752 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.320768 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.320783 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") 
pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.320797 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.320814 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.320828 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.320842 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.320879 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.320895 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.319683 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.320909 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.319794 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.319851 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.319936 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.319966 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.319977 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.320103 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.320211 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.320270 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.320334 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.320431 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.320450 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.320457 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.320482 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.320598 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.320610 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.320660 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.320723 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.320766 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.320799 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.320849 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.320900 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.320928 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.321786 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.321997 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.322003 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.322021 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.322080 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.322091 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.322080 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.322149 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.322217 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.322336 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.322395 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.322421 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.322607 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.322632 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.322640 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.322633 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.322685 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.322785 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). 
InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.322789 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.322822 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.322909 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.322978 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.323010 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.323056 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.323075 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.323132 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.323159 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.323176 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.323330 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.323468 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.323519 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.323587 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.323618 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.323659 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.323708 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.323723 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.323756 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.323787 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.323891 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.323904 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.323933 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.324044 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.324142 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.324194 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.324204 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.324206 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.324329 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.324410 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.324482 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.324580 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.324591 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.324588 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.324688 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.324813 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.324854 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.324880 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.324986 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.325167 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.325179 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.325210 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.325258 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.325410 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.321056 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.325571 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.325603 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.325624 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.325644 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.325641 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.325662 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.325680 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.325696 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.325712 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.325730 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.325747 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.325764 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.325780 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.325799 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.325817 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.325832 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.325849 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.325866 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.325882 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.325898 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.325914 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.325932 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.325947 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.325964 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.325982 4555 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.325996 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.326014 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.326032 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.326047 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.326066 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.326083 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.326100 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.326119 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.326135 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 09 14:58:27 crc 
kubenswrapper[4555]: I1209 14:58:27.326152 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.326166 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.326169 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.326182 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.326203 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.326220 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.326236 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.326253 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.326274 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.326289 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: 
\"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.326326 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.326344 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.326360 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.326378 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.326394 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.326412 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.326426 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.326444 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.326436 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.326575 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.326607 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.326857 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.327782 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.327804 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.327826 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.327847 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.327863 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.327882 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.327912 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.327932 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.327947 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.327948 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.328377 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.328473 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.329393 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.329443 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.329465 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.329487 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.329506 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.329524 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.329544 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.329574 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.329693 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.329592 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.329818 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.329827 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.329854 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.329868 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.329880 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.329897 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.329916 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.330197 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.330520 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.330557 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.330697 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.330780 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.330879 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.330983 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.331632 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.331638 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.331677 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.331879 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.331909 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.332209 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.332352 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.332541 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.332645 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.332771 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.332837 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.332902 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.333082 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.333190 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.333332 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.333383 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.333687 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.333708 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.333711 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.333781 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.333727 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.334455 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.333738 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.334490 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.334486 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.334516 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.334535 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.334569 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.334585 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.334601 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.334814 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.334818 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.334955 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.334990 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.334999 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.335027 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.335096 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.335105 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.335099 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.335165 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.335188 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.335194 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.335285 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.335447 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.335479 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.335485 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.335614 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.335740 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.335799 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.335975 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.335497 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.335614 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.336161 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.336176 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.336179 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.336204 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.336225 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.336242 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.336275 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.336238 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.336290 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.337572 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.337636 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.337656 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.337712 4555 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.337724 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.337733 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.337741 4555 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.337750 4555 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.337758 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.337767 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: 
\"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.337775 4555 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.337782 4555 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.337789 4555 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.337797 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.337804 4555 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.337812 4555 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.337820 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.337828 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.337837 4555 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.337846 4555 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.337854 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.337862 4555 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.337870 4555 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.337878 4555 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.337879 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.337886 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.337900 4555 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.336389 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.336414 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.337926 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.336847 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.336903 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.336928 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.336965 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: E1209 14:58:27.337956 4555 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.337126 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.337164 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.337974 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.336383 4555 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 09 14:58:27 crc kubenswrapper[4555]: E1209 14:58:27.338001 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-09 14:58:27.837988172 +0000 UTC m=+17.974700891 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.338031 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.338278 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.337194 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.337313 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.337343 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.337407 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.338330 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.337459 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.337527 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.337662 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.337758 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: E1209 14:58:27.337963 4555 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.338391 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: E1209 14:58:27.338424 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-09 14:58:27.838410105 +0000 UTC m=+17.975122823 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.338405 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). 
InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.338797 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.338967 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.339231 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.337917 4555 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.339266 4555 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.339277 4555 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.339287 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.339296 4555 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.339320 4555 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.339328 4555 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.339336 4555 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.339345 4555 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.339353 4555 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.339361 4555 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.339369 4555 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.339377 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.339385 4555 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.339393 4555 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.339401 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.339383 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.339410 4555 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.339435 4555 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.339447 4555 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.339456 4555 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.339464 4555 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.339493 4555 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.339502 4555 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.339511 4555 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.339520 4555 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.339529 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.339538 4555 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.339578 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.339587 4555 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 
09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.339595 4555 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.339602 4555 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.339609 4555 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.339616 4555 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.339624 4555 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.339631 4555 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.339657 4555 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.339748 4555 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.339757 4555 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.339768 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.339776 4555 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.339784 4555 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.339791 4555 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 
14:58:27.339818 4555 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.339829 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.339837 4555 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.339845 4555 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.339853 4555 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.339861 4555 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.339868 4555 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.339876 4555 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.339903 4555 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.339911 4555 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.339919 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.339926 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.339934 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: 
\"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.339941 4555 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.339951 4555 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.339958 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.339984 4555 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.339992 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.339999 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.340006 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.340013 4555 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.340020 4555 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.340027 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.340035 4555 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.340061 4555 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.340069 4555 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.340076 4555 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.340084 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.340093 4555 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.340101 4555 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.340108 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.340115 4555 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.340141 4555 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.340150 4555 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.340157 4555 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.340164 4555 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.340172 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.340179 4555 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.340187 4555 reconciler_common.go:293] "Volume detached for volume 
\"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.340195 4555 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.340220 4555 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.340228 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.340236 4555 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.340242 4555 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.340249 4555 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.340256 4555 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.340264 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.340271 4555 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.340278 4555 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.340312 4555 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.340319 4555 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.340326 4555 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.340334 4555 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.340342 4555 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.340350 4555 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.340357 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.340383 4555 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.340390 4555 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.340398 4555 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.340406 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.340415 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.340423 4555 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.340430 4555 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.340437 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.340462 4555 
reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.340470 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.340478 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.340486 4555 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.340493 4555 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.340500 4555 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.340508 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.340515 4555 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.340545 4555 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.340553 4555 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.340568 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.340575 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.340583 4555 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 
14:58:27.340393 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.341348 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.342173 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.343727 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.340591 4555 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.344128 4555 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.344138 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.344146 4555 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.344154 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.344161 4555 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.344168 4555 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node 
\"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.344175 4555 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.344187 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.344195 4555 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.344202 4555 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.345621 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.346747 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.347122 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.347378 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.347726 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.348203 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 09 14:58:27 crc kubenswrapper[4555]: E1209 14:58:27.348362 4555 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 09 14:58:27 crc kubenswrapper[4555]: E1209 14:58:27.348378 4555 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 09 14:58:27 crc kubenswrapper[4555]: E1209 14:58:27.348389 4555 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 14:58:27 crc kubenswrapper[4555]: E1209 14:58:27.348435 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-09 14:58:27.848422418 +0000 UTC m=+17.985135137 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.349233 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.349431 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.349736 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 09 14:58:27 crc kubenswrapper[4555]: E1209 14:58:27.353104 4555 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 09 14:58:27 crc kubenswrapper[4555]: E1209 14:58:27.353124 4555 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 09 14:58:27 crc kubenswrapper[4555]: E1209 14:58:27.353134 4555 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 14:58:27 crc kubenswrapper[4555]: E1209 14:58:27.353173 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-09 14:58:27.853160507 +0000 UTC m=+17.989873225 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.353283 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.353313 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.355417 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.364906 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.369661 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.374512 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.377631 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.445017 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.445048 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.445075 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.445156 4555 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.445185 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.445196 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.445204 4555 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.445212 4555 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.445221 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.445228 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.445235 4555 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.445243 4555 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.445250 4555 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.445258 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.445266 4555 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.445273 4555 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.445280 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.445287 4555 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.445295 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.445317 4555 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.445325 4555 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.445332 4555 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.445339 4555 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.445345 4555 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.445353 4555 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.445359 4555 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.445366 4555 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.445134 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.445373 4555 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.445384 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.445393 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.445400 4555 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.445407 4555 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.445415 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.445423 4555 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.445430 4555 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.445437 4555 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.445448 4555 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.445455 4555 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.445463 4555 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.445471 4555 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.445478 4555 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.445485 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.445492 4555 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.445499 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.449096 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 09 14:58:27 crc kubenswrapper[4555]: W1209 14:58:27.458726 4555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-cecd8d82dad4ef54952232d385fc5c6fad56e350c29f1d502c812f5b8e002a59 WatchSource:0}: Error finding container cecd8d82dad4ef54952232d385fc5c6fad56e350c29f1d502c812f5b8e002a59: Status 404 returned error can't find the container with id cecd8d82dad4ef54952232d385fc5c6fad56e350c29f1d502c812f5b8e002a59 Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.738296 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.743961 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 09 14:58:27 crc kubenswrapper[4555]: W1209 14:58:27.755836 4555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-feb3a43cf9d3571dadd96b44918fc829e46bd2df090189f15403063e18ceedc9 WatchSource:0}: Error finding container feb3a43cf9d3571dadd96b44918fc829e46bd2df090189f15403063e18ceedc9: Status 404 returned error can't find the container with id feb3a43cf9d3571dadd96b44918fc829e46bd2df090189f15403063e18ceedc9 Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.847377 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.847437 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.847460 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:58:27 crc kubenswrapper[4555]: E1209 14:58:27.847510 4555 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 09 14:58:27 crc kubenswrapper[4555]: E1209 14:58:27.847551 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-09 14:58:28.847540592 +0000 UTC m=+18.984253310 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 09 14:58:27 crc kubenswrapper[4555]: E1209 14:58:27.847733 4555 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 09 14:58:27 crc kubenswrapper[4555]: E1209 14:58:27.847772 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 14:58:28.847752661 +0000 UTC m=+18.984465379 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 14:58:27 crc kubenswrapper[4555]: E1209 14:58:27.847806 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-09 14:58:28.847791744 +0000 UTC m=+18.984504462 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.947850 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:58:27 crc kubenswrapper[4555]: I1209 14:58:27.947891 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:58:27 crc kubenswrapper[4555]: E1209 14:58:27.947991 4555 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 09 14:58:27 crc kubenswrapper[4555]: E1209 14:58:27.948008 4555 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 09 14:58:27 crc kubenswrapper[4555]: E1209 14:58:27.948020 4555 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 14:58:27 crc kubenswrapper[4555]: E1209 14:58:27.948040 4555 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 09 14:58:27 crc kubenswrapper[4555]: E1209 14:58:27.948066 4555 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 09 14:58:27 crc kubenswrapper[4555]: E1209 14:58:27.948080 4555 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 14:58:27 crc kubenswrapper[4555]: E1209 14:58:27.948068 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-09 14:58:28.94805471 +0000 UTC m=+19.084767428 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 14:58:27 crc kubenswrapper[4555]: E1209 14:58:27.948131 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-09 14:58:28.948117868 +0000 UTC m=+19.084830596 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 14:58:28 crc kubenswrapper[4555]: I1209 14:58:28.162894 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Dec 09 14:58:28 crc kubenswrapper[4555]: I1209 14:58:28.163359 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Dec 09 14:58:28 crc kubenswrapper[4555]: I1209 14:58:28.164364 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Dec 09 14:58:28 crc kubenswrapper[4555]: I1209 14:58:28.164889 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Dec 09 14:58:28 crc kubenswrapper[4555]: I1209 14:58:28.165720 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Dec 09 14:58:28 crc kubenswrapper[4555]: I1209 14:58:28.166151 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Dec 09 14:58:28 crc kubenswrapper[4555]: I1209 14:58:28.166704 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Dec 
09 14:58:28 crc kubenswrapper[4555]: I1209 14:58:28.167506 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Dec 09 14:58:28 crc kubenswrapper[4555]: I1209 14:58:28.168031 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Dec 09 14:58:28 crc kubenswrapper[4555]: I1209 14:58:28.168822 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Dec 09 14:58:28 crc kubenswrapper[4555]: I1209 14:58:28.169231 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Dec 09 14:58:28 crc kubenswrapper[4555]: I1209 14:58:28.170148 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Dec 09 14:58:28 crc kubenswrapper[4555]: I1209 14:58:28.170583 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Dec 09 14:58:28 crc kubenswrapper[4555]: I1209 14:58:28.171004 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Dec 09 14:58:28 crc kubenswrapper[4555]: I1209 14:58:28.171776 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Dec 09 14:58:28 crc kubenswrapper[4555]: I1209 14:58:28.172209 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Dec 09 14:58:28 crc kubenswrapper[4555]: I1209 14:58:28.173006 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Dec 09 14:58:28 crc kubenswrapper[4555]: I1209 14:58:28.173374 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Dec 09 14:58:28 crc kubenswrapper[4555]: I1209 14:58:28.173840 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Dec 09 14:58:28 crc kubenswrapper[4555]: I1209 14:58:28.174720 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Dec 09 14:58:28 crc kubenswrapper[4555]: I1209 14:58:28.175100 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Dec 09 14:58:28 crc kubenswrapper[4555]: I1209 
14:58:28.175939 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Dec 09 14:58:28 crc kubenswrapper[4555]: I1209 14:58:28.176324 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Dec 09 14:58:28 crc kubenswrapper[4555]: I1209 14:58:28.177173 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Dec 09 14:58:28 crc kubenswrapper[4555]: I1209 14:58:28.177571 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Dec 09 14:58:28 crc kubenswrapper[4555]: I1209 14:58:28.178075 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Dec 09 14:58:28 crc kubenswrapper[4555]: I1209 14:58:28.179000 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Dec 09 14:58:28 crc kubenswrapper[4555]: I1209 14:58:28.179423 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Dec 09 14:58:28 crc kubenswrapper[4555]: I1209 14:58:28.180227 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Dec 09 14:58:28 crc kubenswrapper[4555]: I1209 14:58:28.180685 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Dec 09 14:58:28 crc kubenswrapper[4555]: I1209 14:58:28.181402 4555 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Dec 09 14:58:28 crc kubenswrapper[4555]: I1209 14:58:28.181494 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Dec 09 14:58:28 crc kubenswrapper[4555]: I1209 14:58:28.182982 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Dec 09 14:58:28 crc kubenswrapper[4555]: I1209 14:58:28.183770 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Dec 09 14:58:28 crc kubenswrapper[4555]: I1209 14:58:28.184120 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Dec 09 14:58:28 crc kubenswrapper[4555]: I1209 
14:58:28.185376 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Dec 09 14:58:28 crc kubenswrapper[4555]: I1209 14:58:28.185931 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Dec 09 14:58:28 crc kubenswrapper[4555]: I1209 14:58:28.186891 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Dec 09 14:58:28 crc kubenswrapper[4555]: I1209 14:58:28.187624 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Dec 09 14:58:28 crc kubenswrapper[4555]: I1209 14:58:28.188623 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Dec 09 14:58:28 crc kubenswrapper[4555]: I1209 14:58:28.189042 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Dec 09 14:58:28 crc kubenswrapper[4555]: I1209 14:58:28.189921 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Dec 09 14:58:28 crc kubenswrapper[4555]: I1209 14:58:28.190497 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Dec 09 14:58:28 crc kubenswrapper[4555]: I1209 14:58:28.191384 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Dec 09 14:58:28 crc kubenswrapper[4555]: I1209 14:58:28.191804 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Dec 09 14:58:28 crc kubenswrapper[4555]: I1209 14:58:28.192614 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Dec 09 14:58:28 crc kubenswrapper[4555]: I1209 14:58:28.193050 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Dec 09 14:58:28 crc kubenswrapper[4555]: I1209 14:58:28.194039 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Dec 09 14:58:28 crc kubenswrapper[4555]: I1209 14:58:28.194463 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Dec 09 14:58:28 crc kubenswrapper[4555]: I1209 14:58:28.195211 4555 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Dec 09 14:58:28 crc kubenswrapper[4555]: I1209 14:58:28.195675 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Dec 09 14:58:28 crc kubenswrapper[4555]: I1209 14:58:28.196460 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Dec 09 14:58:28 crc kubenswrapper[4555]: I1209 14:58:28.196942 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Dec 09 14:58:28 crc kubenswrapper[4555]: I1209 14:58:28.197355 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Dec 09 14:58:28 crc kubenswrapper[4555]: I1209 14:58:28.218063 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"feb3a43cf9d3571dadd96b44918fc829e46bd2df090189f15403063e18ceedc9"} Dec 09 14:58:28 crc kubenswrapper[4555]: I1209 14:58:28.219167 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"d81323acabedcf739459ccc35af7dcbf68cd1159c9304c0ef841d774287425cb"} Dec 09 14:58:28 crc kubenswrapper[4555]: I1209 14:58:28.219193 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"b3fe3b9683f49b19ba8ee9f0112562888c19aca322b10358e8083015335e3370"} Dec 09 14:58:28 crc kubenswrapper[4555]: I1209 14:58:28.220488 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"fd5a66337668da0b71f8033ec52d3d1ed9ac69f468d6e48c1e863204ec6a6349"} Dec 09 14:58:28 crc kubenswrapper[4555]: I1209 14:58:28.220520 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"d2fd50693723f3d30a7f5fb890ed870f55b357017b85cdf7206ffc97d0d3d9b8"} Dec 09 14:58:28 crc kubenswrapper[4555]: I1209 14:58:28.220533 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"cecd8d82dad4ef54952232d385fc5c6fad56e350c29f1d502c812f5b8e002a59"} Dec 09 14:58:28 crc kubenswrapper[4555]: I1209 14:58:28.231618 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"031bef68-de1b-47ae-92d9-8f3ab57baecb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd53bcc6affff59de2add405be1496710b4488d3794d358533ddaeecb1c50c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cd3af11c101edd63da0124a2190c0ee60408cf12e241de0b5743aff5732738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://787954aa7f87026339e7b167ba10c60c81f91c4d0f234fc508684f929edda0ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c591bf964c364ed5dfb0b30ae43f7fd456c549f701b0f1b8f7ab9ca7b8648f1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7803e19de1c3c096dfb29cd40b66a640cf8fe695b24eaf0ecfef38e88feefc27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:28Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:28 crc kubenswrapper[4555]: I1209 14:58:28.241136 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:28Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:28 crc kubenswrapper[4555]: I1209 14:58:28.249838 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:28Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:28 crc kubenswrapper[4555]: I1209 14:58:28.259402 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with 
unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:28Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:28 crc kubenswrapper[4555]: I1209 14:58:28.267636 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d81323acabedcf739459ccc35af7dcbf68cd1159c9304c0ef841d774287425cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:28Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:28 crc kubenswrapper[4555]: I1209 14:58:28.276717 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:28Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:28 crc kubenswrapper[4555]: I1209 14:58:28.284755 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:28Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:28 crc kubenswrapper[4555]: I1209 14:58:28.293361 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d81323acabedcf739459ccc35af7dcbf68cd1159c9304c0ef841d774287425cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:28Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:28 crc kubenswrapper[4555]: I1209 14:58:28.302133 4555 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:28Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:28 crc kubenswrapper[4555]: I1209 14:58:28.309832 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:28Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:28 crc kubenswrapper[4555]: I1209 14:58:28.318023 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:28Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:28 crc kubenswrapper[4555]: I1209 14:58:28.326531 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"031bef68-de1b-47ae-92d9-8f3ab57baecb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd53bcc6affff59de2add405be1496710b4488d3794d358533ddaeecb1c50c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cd3af11c101edd63da0124a2190c0ee60408cf12e241de0b5743aff5732738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://787954aa7f87026339e7b167ba10c60c81f91c4d0f234fc508684f929edda0ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c591bf964c364ed5dfb0b30ae43f7fd456c549f701b0f1b8f7ab9ca7b8648f1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7803e19de1c3c096dfb29cd40b66a640cf8fe695b24eaf0ecfef38e88feefc27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:28Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:28 crc kubenswrapper[4555]: I1209 14:58:28.334217 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:28Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:28 crc kubenswrapper[4555]: I1209 14:58:28.343156 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd5a66337668da0b71f8033ec52d3d1ed9ac69f468d6e48c1e863204ec6a6349\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2fd50693723f3d30a7f5fb890ed870f55b357017b85cdf7206ffc97d0d3d9b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:28Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:28 crc kubenswrapper[4555]: I1209 14:58:28.852786 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 14:58:28 crc kubenswrapper[4555]: I1209 14:58:28.852840 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:58:28 crc kubenswrapper[4555]: I1209 14:58:28.852875 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:58:28 crc kubenswrapper[4555]: E1209 14:58:28.852944 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 14:58:30.852926648 +0000 UTC m=+20.989639376 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 14:58:28 crc kubenswrapper[4555]: E1209 14:58:28.852956 4555 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 09 14:58:28 crc kubenswrapper[4555]: E1209 14:58:28.852988 4555 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 09 14:58:28 crc kubenswrapper[4555]: E1209 14:58:28.852996 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-09 14:58:30.852988353 +0000 UTC m=+20.989701072 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 09 14:58:28 crc kubenswrapper[4555]: E1209 14:58:28.853074 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-09 14:58:30.853058957 +0000 UTC m=+20.989771685 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 09 14:58:28 crc kubenswrapper[4555]: I1209 14:58:28.953515 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:58:28 crc kubenswrapper[4555]: I1209 14:58:28.953553 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:58:28 crc kubenswrapper[4555]: E1209 14:58:28.953643 4555 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 09 14:58:28 crc kubenswrapper[4555]: E1209 14:58:28.953665 4555 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 09 14:58:28 crc kubenswrapper[4555]: E1209 14:58:28.953671 4555 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 09 14:58:28 crc kubenswrapper[4555]: E1209 14:58:28.953709 4555 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 09 14:58:28 crc kubenswrapper[4555]: E1209 14:58:28.953676 4555 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 14:58:28 crc kubenswrapper[4555]: E1209 14:58:28.953726 4555 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 14:58:28 crc kubenswrapper[4555]: E1209 14:58:28.953759 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-09 14:58:30.953750447 +0000 UTC m=+21.090463166 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 14:58:28 crc kubenswrapper[4555]: E1209 14:58:28.953774 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-09 14:58:30.95376821 +0000 UTC m=+21.090480929 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 14:58:29 crc kubenswrapper[4555]: I1209 14:58:29.160840 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:58:29 crc kubenswrapper[4555]: I1209 14:58:29.160879 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:58:29 crc kubenswrapper[4555]: I1209 14:58:29.160921 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:58:29 crc kubenswrapper[4555]: E1209 14:58:29.160933 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 14:58:29 crc kubenswrapper[4555]: E1209 14:58:29.161000 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 14:58:29 crc kubenswrapper[4555]: E1209 14:58:29.161043 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 14:58:29 crc kubenswrapper[4555]: I1209 14:58:29.714152 4555 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:29 crc kubenswrapper[4555]: I1209 14:58:29.715246 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:29 crc kubenswrapper[4555]: I1209 14:58:29.715282 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:29 crc kubenswrapper[4555]: I1209 14:58:29.715291 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:29 crc kubenswrapper[4555]: I1209 14:58:29.715368 4555 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 09 14:58:29 crc kubenswrapper[4555]: I1209 14:58:29.719490 4555 kubelet_node_status.go:115] "Node was previously registered" node="crc" Dec 09 14:58:29 crc kubenswrapper[4555]: I1209 14:58:29.719627 4555 kubelet_node_status.go:79] "Successfully registered node" node="crc" Dec 09 14:58:29 crc kubenswrapper[4555]: I1209 14:58:29.720236 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:29 crc kubenswrapper[4555]: I1209 14:58:29.720263 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:29 crc kubenswrapper[4555]: I1209 14:58:29.720271 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:29 crc kubenswrapper[4555]: I1209 14:58:29.720282 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:29 crc kubenswrapper[4555]: I1209 14:58:29.720289 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:29Z","lastTransitionTime":"2025-12-09T14:58:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:29 crc kubenswrapper[4555]: E1209 14:58:29.734331 4555 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"73387e21-fe11-4735-b3c1-0a3cfb47f8fd\\\",\\\"systemUUID\\\":\\\"bd9f288e-1f46-4e96-a21f-7eb84ab89d9c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:29Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:29 crc kubenswrapper[4555]: I1209 14:58:29.736449 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:29 crc kubenswrapper[4555]: I1209 14:58:29.736483 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 09 14:58:29 crc kubenswrapper[4555]: I1209 14:58:29.736492 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:29 crc kubenswrapper[4555]: I1209 14:58:29.736503 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:29 crc kubenswrapper[4555]: I1209 14:58:29.736512 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:29Z","lastTransitionTime":"2025-12-09T14:58:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:29 crc kubenswrapper[4555]: E1209 14:58:29.744389 4555 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"73387e21-fe11-4735-b3c1-0a3cfb47f8fd\\\",\\\"systemUUID\\\":\\\"bd9f288e-1f46-4e96-a21f-7eb84ab89d9c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:29Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:29 crc kubenswrapper[4555]: I1209 14:58:29.746548 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:29 crc kubenswrapper[4555]: I1209 14:58:29.746579 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 09 14:58:29 crc kubenswrapper[4555]: I1209 14:58:29.746588 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:29 crc kubenswrapper[4555]: I1209 14:58:29.746600 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:29 crc kubenswrapper[4555]: I1209 14:58:29.746609 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:29Z","lastTransitionTime":"2025-12-09T14:58:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:29 crc kubenswrapper[4555]: E1209 14:58:29.754073 4555 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"73387e21-fe11-4735-b3c1-0a3cfb47f8fd\\\",\\\"systemUUID\\\":\\\"bd9f288e-1f46-4e96-a21f-7eb84ab89d9c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:29Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:29 crc kubenswrapper[4555]: I1209 14:58:29.756294 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:29 crc kubenswrapper[4555]: I1209 14:58:29.756339 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 09 14:58:29 crc kubenswrapper[4555]: I1209 14:58:29.756349 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:29 crc kubenswrapper[4555]: I1209 14:58:29.756361 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:29 crc kubenswrapper[4555]: I1209 14:58:29.756370 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:29Z","lastTransitionTime":"2025-12-09T14:58:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:29 crc kubenswrapper[4555]: E1209 14:58:29.767245 4555 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"73387e21-fe11-4735-b3c1-0a3cfb47f8fd\\\",\\\"systemUUID\\\":\\\"bd9f288e-1f46-4e96-a21f-7eb84ab89d9c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:29Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:29 crc kubenswrapper[4555]: I1209 14:58:29.769905 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:29 crc kubenswrapper[4555]: I1209 14:58:29.769935 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 09 14:58:29 crc kubenswrapper[4555]: I1209 14:58:29.769945 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:29 crc kubenswrapper[4555]: I1209 14:58:29.769956 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:29 crc kubenswrapper[4555]: I1209 14:58:29.769964 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:29Z","lastTransitionTime":"2025-12-09T14:58:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:29 crc kubenswrapper[4555]: E1209 14:58:29.777482 4555 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"73387e21-fe11-4735-b3c1-0a3cfb47f8fd\\\",\\\"systemUUID\\\":\\\"bd9f288e-1f46-4e96-a21f-7eb84ab89d9c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:29Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:29 crc kubenswrapper[4555]: E1209 14:58:29.777590 4555 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 09 14:58:29 crc kubenswrapper[4555]: I1209 14:58:29.778664 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 09 14:58:29 crc kubenswrapper[4555]: I1209 14:58:29.778686 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:29 crc kubenswrapper[4555]: I1209 14:58:29.778695 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:29 crc kubenswrapper[4555]: I1209 14:58:29.778705 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:29 crc kubenswrapper[4555]: I1209 14:58:29.778712 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:29Z","lastTransitionTime":"2025-12-09T14:58:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:29 crc kubenswrapper[4555]: I1209 14:58:29.880811 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:29 crc kubenswrapper[4555]: I1209 14:58:29.880841 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:29 crc kubenswrapper[4555]: I1209 14:58:29.880850 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:29 crc kubenswrapper[4555]: I1209 14:58:29.880862 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:29 crc kubenswrapper[4555]: I1209 14:58:29.880870 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:29Z","lastTransitionTime":"2025-12-09T14:58:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:29 crc kubenswrapper[4555]: I1209 14:58:29.884371 4555 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Dec 09 14:58:29 crc kubenswrapper[4555]: I1209 14:58:29.892009 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Dec 09 14:58:29 crc kubenswrapper[4555]: I1209 14:58:29.893152 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d81323acabedcf739459ccc35af7dcbf68cd1159c9304c0ef841d774287425cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:29Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:29 crc kubenswrapper[4555]: I1209 14:58:29.894532 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Dec 09 14:58:29 crc kubenswrapper[4555]: I1209 14:58:29.901598 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:29Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:29 crc kubenswrapper[4555]: I1209 14:58:29.908788 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:29Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:29 crc kubenswrapper[4555]: I1209 14:58:29.917971 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"031bef68-de1b-47ae-92d9-8f3ab57baecb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd53bcc6affff59de2add405be1496710b4488d3794d358533ddaeecb1c50c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cd3af11c101edd63da0124a2190c0ee60408cf12e241de0b5743aff5732738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\
":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://787954aa7f87026339e7b167ba10c60c81f91c4d0f234fc508684f929edda0ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c591bf964c364ed5dfb0b30ae43f7fd456c549f701b0f1b8f7ab9ca7b8648f1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7803e19de1c3c096dfb29cd40b66a640cf8fe695b24eaf0ecfef38e88feefc27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:29Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:29 crc kubenswrapper[4555]: I1209 14:58:29.925678 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:29Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:29 crc kubenswrapper[4555]: I1209 14:58:29.933609 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd5a66337668da0b71f8033ec52d3d1ed9ac69f468d6e48c1e863204ec6a6349\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2fd50693723f3d30a7f5fb890ed870f55b357017b85cdf7206ffc97d0d3d9b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:29Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:29 crc kubenswrapper[4555]: I1209 14:58:29.940956 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:29Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:29 crc kubenswrapper[4555]: I1209 14:58:29.954596 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cb89a32-a132-4a5c-88ca-e561d2462abb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8452332d27457e371020c42533ef78285e33851568eed69be71bc9d10e663b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32422f3e8abddcc0d533cce0b4fc55c65309e00a02d4757d415d6f63f9b2e47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9db356805ba4080379f39ddf432276ac77417504341e77c1cc2d704756af1f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550d7f8a2e6cf94de70026192f2660d59f31fa1675a7c7e432767db5bf58c1b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://630ad7ccda8e51ae4d116b5821e71ebcc3fb68b4614de58470610807ad351ed6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\
\\":{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:29Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:29 crc kubenswrapper[4555]: I1209 14:58:29.963133 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d81323acabedcf739459ccc35af7dcbf68cd1159c9304c0ef841d774287425cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:29Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:29 crc kubenswrapper[4555]: I1209 14:58:29.970730 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:29Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:29 crc kubenswrapper[4555]: I1209 14:58:29.977978 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:29Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:29 crc kubenswrapper[4555]: I1209 14:58:29.983134 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:29 crc kubenswrapper[4555]: I1209 14:58:29.983168 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:29 crc kubenswrapper[4555]: I1209 14:58:29.983177 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:29 crc kubenswrapper[4555]: I1209 14:58:29.983191 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:29 crc kubenswrapper[4555]: I1209 14:58:29.983200 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:29Z","lastTransitionTime":"2025-12-09T14:58:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:29 crc kubenswrapper[4555]: I1209 14:58:29.985291 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:29Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:29 crc kubenswrapper[4555]: I1209 14:58:29.993497 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"031bef68-de1b-47ae-92d9-8f3ab57baecb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd53bcc6affff59de2add405be1496710b4488d3794d358533ddaeecb1c50c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cd3af11c101edd63da0124a2190c0ee60408cf12e241de0b5743aff5732738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://787954aa7f87026339e7b167ba10c60c81f91c4d0f234fc508684f929edda0ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c591bf964c364ed5dfb0b30ae43f7fd456c549f701b0f1b8f7ab9ca7b8648f1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7803e19de1c3c096dfb29cd40b66a640cf8fe695b24eaf0ecfef38e88feefc27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:29Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:30 crc kubenswrapper[4555]: I1209 14:58:30.001159 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:30Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:30 crc kubenswrapper[4555]: I1209 14:58:30.008703 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd5a66337668da0b71f8033ec52d3d1ed9ac69f468d6e48c1e863204ec6a6349\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2fd50693723f3d30a7f5fb890ed870f55b357017b85cdf7206ffc97d0d3d9b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:30Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:30 crc kubenswrapper[4555]: I1209 14:58:30.084832 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:30 crc kubenswrapper[4555]: I1209 14:58:30.084858 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:30 crc kubenswrapper[4555]: I1209 14:58:30.084867 4555 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 09 14:58:30 crc kubenswrapper[4555]: I1209 14:58:30.084879 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:30 crc kubenswrapper[4555]: I1209 14:58:30.084887 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:30Z","lastTransitionTime":"2025-12-09T14:58:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:30 crc kubenswrapper[4555]: I1209 14:58:30.170881 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:30Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:30 crc kubenswrapper[4555]: I1209 14:58:30.180193 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"031bef68-de1b-47ae-92d9-8f3ab57baecb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd53bcc6affff59de2add405be1496710b4488d3794d358533ddaeecb1c50c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cd3af11c101edd63da0124a2190c0ee60408cf12e241de0b5743aff5732738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://787954aa7f87026339e7b167ba10c60c81f91c4d0f234fc508684f929edda0ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c591bf964c364ed5dfb0b30ae43f7fd456c549f701b0f1b8f7ab9ca7b8648f1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7803e19de1c3c096dfb29cd40b66a640cf8fe695b24eaf0ecfef38e88feefc27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:30Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:30 crc kubenswrapper[4555]: I1209 14:58:30.186692 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:30 crc kubenswrapper[4555]: I1209 14:58:30.186728 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:30 crc kubenswrapper[4555]: I1209 14:58:30.186737 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:30 crc kubenswrapper[4555]: I1209 14:58:30.186750 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:30 crc kubenswrapper[4555]: I1209 14:58:30.186759 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:30Z","lastTransitionTime":"2025-12-09T14:58:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:30 crc kubenswrapper[4555]: I1209 14:58:30.187950 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:30Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:30 crc kubenswrapper[4555]: I1209 14:58:30.195684 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd5a66337668da0b71f8033ec52d3d1ed9ac69f468d6e48c1e863204ec6a6349\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2fd50693723f3d30a7f5fb890ed870f55b357017b85cdf7206ffc97d0d3d9b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:30Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:30 crc kubenswrapper[4555]: I1209 14:58:30.210600 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cb89a32-a132-4a5c-88ca-e561d2462abb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8452332d27457e371020c42533ef78285e33851568eed69be71bc9d10e663b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32422f3e8abddcc0d533cce0b4fc55c65309e00a02d4757d415d6f63f9b2e47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\
\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9db356805ba4080379f39ddf432276ac77417504341e77c1cc2d704756af1f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550d7f8a2e6cf94de70026192f2660d59f31fa1675a7c7e432767db5bf58c1b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://630ad7ccda8e51ae4d116b5821e71ebcc3fb68b4614de58470610807ad351ed6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da967
51421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:30Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:30 crc kubenswrapper[4555]: I1209 14:58:30.219896 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d81323acabedcf739459ccc35af7dcbf68cd1159c9304c0ef841d774287425cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:30Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:30 crc kubenswrapper[4555]: I1209 14:58:30.225034 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"222e3c51a925bbfd932c3620f13ad919006e2dad2c9063c60559ece4dd239f3a"} Dec 09 14:58:30 crc kubenswrapper[4555]: I1209 14:58:30.230206 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:30Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:30 crc kubenswrapper[4555]: E1209 14:58:30.231633 4555 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-crc\" already exists" pod="openshift-etcd/etcd-crc" Dec 09 14:58:30 crc kubenswrapper[4555]: I1209 14:58:30.237283 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:30Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:30 crc kubenswrapper[4555]: I1209 14:58:30.244569 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:30Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:30 crc kubenswrapper[4555]: I1209 14:58:30.252533 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"031bef68-de1b-47ae-92d9-8f3ab57baecb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd53bcc6affff59de2add405be1496710b4488d3794d358533ddaeecb1c50c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cd3af11c101edd63da0124a2190c0ee60408cf12e241de0b5743aff5732738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://787954aa7f87026339e7b167ba10c60c81f91c4d0f234fc508684f929edda0ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c591bf964c364ed5dfb0b30ae43f7fd456c549f701b0f1b8f7ab9ca7b8648f1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7803e19de1c3c096dfb29cd40b66a640cf8fe695b24eaf0ecfef38e88feefc27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:30Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:30 crc kubenswrapper[4555]: I1209 14:58:30.259696 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:30Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:30 crc kubenswrapper[4555]: I1209 14:58:30.267021 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd5a66337668da0b71f8033ec52d3d1ed9ac69f468d6e48c1e863204ec6a6349\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2fd50693723f3d30a7f5fb890ed870f55b357017b85cdf7206ffc97d0d3d9b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:30Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:30 crc kubenswrapper[4555]: I1209 14:58:30.278665 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cb89a32-a132-4a5c-88ca-e561d2462abb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8452332d27457e371020c42533ef78285e33851568eed69be71bc9d10e663b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32422f3e8abddcc0d533cce0b4fc55c65309e00a02d4757d415d6f63f9b2e47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\
\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9db356805ba4080379f39ddf432276ac77417504341e77c1cc2d704756af1f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550d7f8a2e6cf94de70026192f2660d59f31fa1675a7c7e432767db5bf58c1b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://630ad7ccda8e51ae4d116b5821e71ebcc3fb68b4614de58470610807ad351ed6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da967
51421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:30Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:30 crc kubenswrapper[4555]: I1209 14:58:30.286049 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d81323acabedcf739459ccc35af7dcbf68cd1159c9304c0ef841d774287425cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:30Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:30 crc kubenswrapper[4555]: I1209 14:58:30.288109 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:30 crc kubenswrapper[4555]: I1209 14:58:30.288154 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:30 crc kubenswrapper[4555]: I1209 14:58:30.288164 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:30 crc kubenswrapper[4555]: I1209 14:58:30.288177 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:30 crc kubenswrapper[4555]: I1209 14:58:30.288185 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:30Z","lastTransitionTime":"2025-12-09T14:58:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:30 crc kubenswrapper[4555]: I1209 14:58:30.293618 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:30Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:30 crc kubenswrapper[4555]: I1209 14:58:30.300467 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://222e3c51a925bbfd932c3620f13ad919006e2dad2c9063c60559ece4dd239f3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:30Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:30 crc kubenswrapper[4555]: I1209 14:58:30.389805 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:30 crc kubenswrapper[4555]: I1209 14:58:30.389831 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:30 crc kubenswrapper[4555]: I1209 14:58:30.389840 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:30 crc kubenswrapper[4555]: I1209 14:58:30.389851 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:30 crc kubenswrapper[4555]: I1209 14:58:30.389859 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:30Z","lastTransitionTime":"2025-12-09T14:58:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:30 crc kubenswrapper[4555]: I1209 14:58:30.491761 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:30 crc kubenswrapper[4555]: I1209 14:58:30.491789 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:30 crc kubenswrapper[4555]: I1209 14:58:30.491797 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:30 crc kubenswrapper[4555]: I1209 14:58:30.491809 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:30 crc kubenswrapper[4555]: I1209 14:58:30.491818 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:30Z","lastTransitionTime":"2025-12-09T14:58:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:30 crc kubenswrapper[4555]: I1209 14:58:30.593686 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:30 crc kubenswrapper[4555]: I1209 14:58:30.593729 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:30 crc kubenswrapper[4555]: I1209 14:58:30.593738 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:30 crc kubenswrapper[4555]: I1209 14:58:30.593751 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:30 crc kubenswrapper[4555]: I1209 14:58:30.593760 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:30Z","lastTransitionTime":"2025-12-09T14:58:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:30 crc kubenswrapper[4555]: I1209 14:58:30.695771 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:30 crc kubenswrapper[4555]: I1209 14:58:30.695800 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:30 crc kubenswrapper[4555]: I1209 14:58:30.695827 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:30 crc kubenswrapper[4555]: I1209 14:58:30.695838 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:30 crc kubenswrapper[4555]: I1209 14:58:30.695846 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:30Z","lastTransitionTime":"2025-12-09T14:58:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:30 crc kubenswrapper[4555]: I1209 14:58:30.797543 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:30 crc kubenswrapper[4555]: I1209 14:58:30.797568 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:30 crc kubenswrapper[4555]: I1209 14:58:30.797577 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:30 crc kubenswrapper[4555]: I1209 14:58:30.797586 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:30 crc kubenswrapper[4555]: I1209 14:58:30.797595 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:30Z","lastTransitionTime":"2025-12-09T14:58:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:30 crc kubenswrapper[4555]: I1209 14:58:30.867070 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 14:58:30 crc kubenswrapper[4555]: I1209 14:58:30.867113 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:58:30 crc kubenswrapper[4555]: I1209 14:58:30.867141 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:58:30 crc kubenswrapper[4555]: E1209 14:58:30.867201 4555 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 09 14:58:30 crc kubenswrapper[4555]: E1209 14:58:30.867228 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 14:58:34.867206698 +0000 UTC m=+25.003919416 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 14:58:30 crc kubenswrapper[4555]: E1209 14:58:30.867253 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-09 14:58:34.867245942 +0000 UTC m=+25.003958659 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 09 14:58:30 crc kubenswrapper[4555]: E1209 14:58:30.867360 4555 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 09 14:58:30 crc kubenswrapper[4555]: E1209 14:58:30.867395 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-09 14:58:34.867387197 +0000 UTC m=+25.004099925 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 09 14:58:30 crc kubenswrapper[4555]: I1209 14:58:30.899615 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:30 crc kubenswrapper[4555]: I1209 14:58:30.899646 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:30 crc kubenswrapper[4555]: I1209 14:58:30.899654 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:30 crc kubenswrapper[4555]: I1209 14:58:30.899667 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:30 crc kubenswrapper[4555]: I1209 14:58:30.899675 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:30Z","lastTransitionTime":"2025-12-09T14:58:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:30 crc kubenswrapper[4555]: I1209 14:58:30.968189 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:58:30 crc kubenswrapper[4555]: I1209 14:58:30.968227 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:58:30 crc kubenswrapper[4555]: E1209 14:58:30.968323 4555 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 09 14:58:30 crc kubenswrapper[4555]: E1209 14:58:30.968330 4555 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 09 14:58:30 crc kubenswrapper[4555]: E1209 14:58:30.968338 4555 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 09 14:58:30 crc kubenswrapper[4555]: E1209 14:58:30.968348 4555 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 09 14:58:30 crc kubenswrapper[4555]: E1209 14:58:30.968349 4555 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 14:58:30 crc kubenswrapper[4555]: E1209 14:58:30.968357 4555 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 14:58:30 crc kubenswrapper[4555]: E1209 14:58:30.968387 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-09 14:58:34.968376998 +0000 UTC m=+25.105089716 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 14:58:30 crc kubenswrapper[4555]: E1209 14:58:30.968399 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-09 14:58:34.96839408 +0000 UTC m=+25.105106798 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 14:58:31 crc kubenswrapper[4555]: I1209 14:58:31.001704 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:31 crc kubenswrapper[4555]: I1209 14:58:31.001737 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:31 crc kubenswrapper[4555]: I1209 14:58:31.001746 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:31 crc kubenswrapper[4555]: I1209 14:58:31.001760 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:31 crc kubenswrapper[4555]: I1209 14:58:31.001770 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:31Z","lastTransitionTime":"2025-12-09T14:58:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:31 crc kubenswrapper[4555]: I1209 14:58:31.103157 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:31 crc kubenswrapper[4555]: I1209 14:58:31.103180 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:31 crc kubenswrapper[4555]: I1209 14:58:31.103189 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:31 crc kubenswrapper[4555]: I1209 14:58:31.103200 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:31 crc kubenswrapper[4555]: I1209 14:58:31.103208 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:31Z","lastTransitionTime":"2025-12-09T14:58:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:31 crc kubenswrapper[4555]: I1209 14:58:31.107477 4555 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 14:58:31 crc kubenswrapper[4555]: I1209 14:58:31.109903 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 14:58:31 crc kubenswrapper[4555]: I1209 14:58:31.113570 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Dec 09 14:58:31 crc kubenswrapper[4555]: I1209 14:58:31.117701 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"031bef68-de1b-47ae-92d9-8f3ab57baecb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd53bcc6affff59de2add405be1496710b4488d3794d358533ddaeecb1c50c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cd3af11c101edd63da0124a2190c0ee60408cf12e241de0b5743aff5732738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://787954aa7f87026339e7b167ba10c60c81f91c4d0f234fc508684f929edda0ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha
256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c591bf964c364ed5dfb0b30ae43f7fd456c549f701b0f1b8f7ab9ca7b8648f1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7803e19de1c3c096dfb29cd40b66a640cf8fe695b24eaf0ecfef38e88feefc27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:31Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:31 crc kubenswrapper[4555]: I1209 14:58:31.125844 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:31Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:31 crc kubenswrapper[4555]: I1209 14:58:31.154174 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd5a66337668da0b71f8033ec52d3d1ed9ac69f468d6e48c1e863204ec6a6349\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2fd50693723f3d30a7f5fb890ed870f55b357017b85cdf7206ffc97d0d3d9b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:31Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:31 crc kubenswrapper[4555]: I1209 14:58:31.160426 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:58:31 crc kubenswrapper[4555]: E1209 14:58:31.160525 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 14:58:31 crc kubenswrapper[4555]: I1209 14:58:31.160748 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:58:31 crc kubenswrapper[4555]: E1209 14:58:31.160795 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 14:58:31 crc kubenswrapper[4555]: I1209 14:58:31.160836 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:58:31 crc kubenswrapper[4555]: E1209 14:58:31.160871 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 14:58:31 crc kubenswrapper[4555]: I1209 14:58:31.166040 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:31Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:31 crc kubenswrapper[4555]: I1209 14:58:31.182481 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cb89a32-a132-4a5c-88ca-e561d2462abb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8452332d27457e371020c42533ef78285e33851568eed69be71bc9d10e663b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32422f3e8abddcc0d533cce0b4fc55c65309e00a02d4757d415d6f63f9b2e47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9db356805ba4080379f39ddf432276ac77417504341e77c1cc2d704756af1f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550d7f8a2e6cf94de70026192f2660d59f31fa1675a7c7e432767db5bf58c1b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://630ad7ccda8e51ae4d116b5821e71ebcc3fb68b4614de58470610807ad351ed6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\
\\":{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:31Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:31 crc kubenswrapper[4555]: I1209 14:58:31.192391 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d81323acabedcf739459ccc35af7dcbf68cd1159c9304c0ef841d774287425cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:31Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:31 crc kubenswrapper[4555]: I1209 14:58:31.201057 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:31Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:31 crc kubenswrapper[4555]: I1209 14:58:31.205002 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:31 crc kubenswrapper[4555]: I1209 14:58:31.205028 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:31 crc kubenswrapper[4555]: I1209 14:58:31.205038 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:31 crc kubenswrapper[4555]: I1209 14:58:31.205051 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:31 crc kubenswrapper[4555]: I1209 14:58:31.205060 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:31Z","lastTransitionTime":"2025-12-09T14:58:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:31 crc kubenswrapper[4555]: I1209 14:58:31.209555 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://222e3c51a925bbfd932c3620f13ad919006e2dad2c9063c60559ece4dd239f3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:31Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:31 crc kubenswrapper[4555]: I1209 14:58:31.222420 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cb89a32-a132-4a5c-88ca-e561d2462abb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8452332d27457e371020c42533ef78285e33851568eed69be71bc9d10e663b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32422f3e8abddcc0d533cce0b4fc55c65309e00a02d4757d415d6f63f9b2e47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9db356805ba4080379f39ddf432276ac77417504341e77c1cc2d704756af1f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550d7f8a2e6cf94de70026192f2660d59f31fa1
675a7c7e432767db5bf58c1b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://630ad7ccda8e51ae4d116b5821e71ebcc3fb68b4614de58470610807ad351ed6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:31Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:31 crc kubenswrapper[4555]: I1209 14:58:31.231425 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d81323acabedcf739459ccc35af7dcbf68cd1159c9304c0ef841d774287425cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:31Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:31 crc kubenswrapper[4555]: I1209 14:58:31.239487 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:31Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:31 crc kubenswrapper[4555]: I1209 14:58:31.247531 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://222e3c51a925bbfd932c3620f13ad919006e2dad2c9063c60559ece4dd239f3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:31Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:31 crc kubenswrapper[4555]: I1209 14:58:31.256757 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"031bef68-de1b-47ae-92d9-8f3ab57baecb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd53bcc6affff59de2add405be1496710b4488d3794d358533ddaeecb1c50c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cd3af11c101edd63da0124a2190c0ee60408cf12e241de0b5743aff5732738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://787954aa7f87026339e7b167ba10c60c81f91c4d0f234fc508684f929edda0ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c591bf964c364ed5dfb0b30ae43f7fd456c549f701b0f1b8f7ab9ca7b8648f1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7803e19de1c3c096dfb29cd40b66a640cf8fe695b24eaf0ecfef38e88feefc27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:31Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:31 crc kubenswrapper[4555]: I1209 14:58:31.266447 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23358419-437c-41bf-bf1f-0da572d7ae77\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5ec008c2fa2ff6b0bffe7a32708ff1d0bfcfc0b6edbdd084977b99a2051bdc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://376e853c04e3ccafef162872380c7bf1dc0301c04c70c5428b965ca18989088f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6099f0072464b66ae672e70c295fe21a13657b81cc71dcecc55314cd91816c75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b8f1c6dd89c91a8ff2ca8baf361ee78811eeb8e10f2c4bb0504644cdd1302d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:31Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:31 crc kubenswrapper[4555]: I1209 14:58:31.275720 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:31Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:31 crc kubenswrapper[4555]: I1209 14:58:31.284822 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd5a66337668da0b71f8033ec52d3d1ed9ac69f468d6e48c1e863204ec6a6349\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2fd50693723f3d30a7f5fb890ed870f55b357017b85cdf7206ffc97d0d3d9b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:31Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:31 crc kubenswrapper[4555]: I1209 14:58:31.293942 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:31Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:31 crc kubenswrapper[4555]: I1209 14:58:31.307168 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:31 crc kubenswrapper[4555]: I1209 14:58:31.307197 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:31 crc kubenswrapper[4555]: I1209 14:58:31.307205 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:31 crc kubenswrapper[4555]: I1209 14:58:31.307218 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:31 crc kubenswrapper[4555]: I1209 14:58:31.307226 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:31Z","lastTransitionTime":"2025-12-09T14:58:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:31 crc kubenswrapper[4555]: I1209 14:58:31.408913 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:31 crc kubenswrapper[4555]: I1209 14:58:31.408941 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:31 crc kubenswrapper[4555]: I1209 14:58:31.408951 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:31 crc kubenswrapper[4555]: I1209 14:58:31.408962 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:31 crc kubenswrapper[4555]: I1209 14:58:31.408970 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:31Z","lastTransitionTime":"2025-12-09T14:58:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:31 crc kubenswrapper[4555]: I1209 14:58:31.510257 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:31 crc kubenswrapper[4555]: I1209 14:58:31.510279 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:31 crc kubenswrapper[4555]: I1209 14:58:31.510287 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:31 crc kubenswrapper[4555]: I1209 14:58:31.510297 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:31 crc kubenswrapper[4555]: I1209 14:58:31.510322 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:31Z","lastTransitionTime":"2025-12-09T14:58:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:31 crc kubenswrapper[4555]: I1209 14:58:31.611630 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:31 crc kubenswrapper[4555]: I1209 14:58:31.611650 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:31 crc kubenswrapper[4555]: I1209 14:58:31.611658 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:31 crc kubenswrapper[4555]: I1209 14:58:31.611669 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:31 crc kubenswrapper[4555]: I1209 14:58:31.611677 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:31Z","lastTransitionTime":"2025-12-09T14:58:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:31 crc kubenswrapper[4555]: I1209 14:58:31.713878 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:31 crc kubenswrapper[4555]: I1209 14:58:31.713904 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:31 crc kubenswrapper[4555]: I1209 14:58:31.713914 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:31 crc kubenswrapper[4555]: I1209 14:58:31.713924 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:31 crc kubenswrapper[4555]: I1209 14:58:31.713933 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:31Z","lastTransitionTime":"2025-12-09T14:58:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:31 crc kubenswrapper[4555]: I1209 14:58:31.815120 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:31 crc kubenswrapper[4555]: I1209 14:58:31.815159 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:31 crc kubenswrapper[4555]: I1209 14:58:31.815169 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:31 crc kubenswrapper[4555]: I1209 14:58:31.815178 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:31 crc kubenswrapper[4555]: I1209 14:58:31.815187 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:31Z","lastTransitionTime":"2025-12-09T14:58:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:31 crc kubenswrapper[4555]: I1209 14:58:31.917443 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:31 crc kubenswrapper[4555]: I1209 14:58:31.917478 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:31 crc kubenswrapper[4555]: I1209 14:58:31.917488 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:31 crc kubenswrapper[4555]: I1209 14:58:31.917510 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:31 crc kubenswrapper[4555]: I1209 14:58:31.917519 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:31Z","lastTransitionTime":"2025-12-09T14:58:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.018940 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.018963 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.018971 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.018998 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.019007 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:32Z","lastTransitionTime":"2025-12-09T14:58:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.120613 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.120638 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.120646 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.120658 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.120668 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:32Z","lastTransitionTime":"2025-12-09T14:58:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.222078 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.222110 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.222117 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.222129 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.222137 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:32Z","lastTransitionTime":"2025-12-09T14:58:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.324003 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.324075 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.324088 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.324112 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.324134 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:32Z","lastTransitionTime":"2025-12-09T14:58:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.425970 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.426009 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.426019 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.426034 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.426045 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:32Z","lastTransitionTime":"2025-12-09T14:58:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.453021 4555 csr.go:261] certificate signing request csr-5q2tj is approved, waiting to be issued Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.484249 4555 csr.go:257] certificate signing request csr-5q2tj is issued Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.527611 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.527652 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.527664 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.527679 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.527688 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:32Z","lastTransitionTime":"2025-12-09T14:58:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.632377 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.632423 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.632433 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.632447 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.632458 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:32Z","lastTransitionTime":"2025-12-09T14:58:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.734119 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.734154 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.734162 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.734177 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.734186 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:32Z","lastTransitionTime":"2025-12-09T14:58:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.825459 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-gjjcm"] Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.825805 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-gjjcm" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.826088 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-x6p6q"] Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.826426 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-t6kl9"] Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.826634 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-t6kl9" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.826626 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.831808 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.831822 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.832587 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.833111 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.833439 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.833594 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.833693 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.834119 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.834129 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.834145 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.834158 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.834146 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.834492 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.835510 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.835533 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.835544 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.835555 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.835563 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:32Z","lastTransitionTime":"2025-12-09T14:58:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration 
file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.849607 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cb89a32-a132-4a5c-88ca-e561d2462abb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8452332d27457e371020c42533ef78285e33851568eed69be71bc9d10e663b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32422f3e8abddcc0d533cce0b4fc55c65309e00a02d4757d415d6f63f9b2e47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9db356805ba4080379f39ddf432276ac77417504341e77c1cc2d704756af1f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550d7f8a2e6cf94de70026192f2660d59f31fa1675a7c7e432767db5bf58c1b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://630ad7ccda8e51ae4d116b5821e71ebcc3fb68b4614de58470610807ad351ed6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19
\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:32Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.864762 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d81323acabedcf739459ccc35af7dcbf68cd1159c9304c0ef841d774287425cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:32Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.880687 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:32Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.884011 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c97d4dc5-ce47-4a7b-9511-7168db0ee4b4-hosts-file\") pod \"node-resolver-t6kl9\" (UID: \"c97d4dc5-ce47-4a7b-9511-7168db0ee4b4\") " pod="openshift-dns/node-resolver-t6kl9" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.884064 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3e34542c-29e8-43d3-add3-42aa080e0c8a-cnibin\") pod \"multus-gjjcm\" (UID: \"3e34542c-29e8-43d3-add3-42aa080e0c8a\") " pod="openshift-multus/multus-gjjcm" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.884081 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3e34542c-29e8-43d3-add3-42aa080e0c8a-cni-binary-copy\") pod \"multus-gjjcm\" (UID: \"3e34542c-29e8-43d3-add3-42aa080e0c8a\") " pod="openshift-multus/multus-gjjcm" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.884105 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqc9g\" (UniqueName: \"kubernetes.io/projected/3e34542c-29e8-43d3-add3-42aa080e0c8a-kube-api-access-sqc9g\") pod \"multus-gjjcm\" (UID: \"3e34542c-29e8-43d3-add3-42aa080e0c8a\") " pod="openshift-multus/multus-gjjcm" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.884160 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3e34542c-29e8-43d3-add3-42aa080e0c8a-multus-socket-dir-parent\") pod \"multus-gjjcm\" (UID: \"3e34542c-29e8-43d3-add3-42aa080e0c8a\") " pod="openshift-multus/multus-gjjcm" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.884186 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3e34542c-29e8-43d3-add3-42aa080e0c8a-host-var-lib-cni-bin\") pod \"multus-gjjcm\" (UID: \"3e34542c-29e8-43d3-add3-42aa080e0c8a\") " pod="openshift-multus/multus-gjjcm" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.884207 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3e34542c-29e8-43d3-add3-42aa080e0c8a-hostroot\") pod 
\"multus-gjjcm\" (UID: \"3e34542c-29e8-43d3-add3-42aa080e0c8a\") " pod="openshift-multus/multus-gjjcm" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.884223 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3e34542c-29e8-43d3-add3-42aa080e0c8a-multus-daemon-config\") pod \"multus-gjjcm\" (UID: \"3e34542c-29e8-43d3-add3-42aa080e0c8a\") " pod="openshift-multus/multus-gjjcm" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.884237 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3e34542c-29e8-43d3-add3-42aa080e0c8a-etc-kubernetes\") pod \"multus-gjjcm\" (UID: \"3e34542c-29e8-43d3-add3-42aa080e0c8a\") " pod="openshift-multus/multus-gjjcm" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.884254 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3e34542c-29e8-43d3-add3-42aa080e0c8a-host-run-netns\") pod \"multus-gjjcm\" (UID: \"3e34542c-29e8-43d3-add3-42aa080e0c8a\") " pod="openshift-multus/multus-gjjcm" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.884269 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3e34542c-29e8-43d3-add3-42aa080e0c8a-host-run-multus-certs\") pod \"multus-gjjcm\" (UID: \"3e34542c-29e8-43d3-add3-42aa080e0c8a\") " pod="openshift-multus/multus-gjjcm" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.884313 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3e34542c-29e8-43d3-add3-42aa080e0c8a-host-run-k8s-cni-cncf-io\") pod \"multus-gjjcm\" (UID: \"3e34542c-29e8-43d3-add3-42aa080e0c8a\") " pod="openshift-multus/multus-gjjcm" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.884341 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3e34542c-29e8-43d3-add3-42aa080e0c8a-multus-cni-dir\") pod \"multus-gjjcm\" (UID: \"3e34542c-29e8-43d3-add3-42aa080e0c8a\") " pod="openshift-multus/multus-gjjcm" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.884354 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3e34542c-29e8-43d3-add3-42aa080e0c8a-host-var-lib-kubelet\") pod \"multus-gjjcm\" (UID: \"3e34542c-29e8-43d3-add3-42aa080e0c8a\") " pod="openshift-multus/multus-gjjcm" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.884368 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47xhl\" (UniqueName: \"kubernetes.io/projected/c97d4dc5-ce47-4a7b-9511-7168db0ee4b4-kube-api-access-47xhl\") pod \"node-resolver-t6kl9\" (UID: \"c97d4dc5-ce47-4a7b-9511-7168db0ee4b4\") " pod="openshift-dns/node-resolver-t6kl9" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.884386 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3e34542c-29e8-43d3-add3-42aa080e0c8a-host-var-lib-cni-multus\") pod 
\"multus-gjjcm\" (UID: \"3e34542c-29e8-43d3-add3-42aa080e0c8a\") " pod="openshift-multus/multus-gjjcm" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.884403 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3e34542c-29e8-43d3-add3-42aa080e0c8a-multus-conf-dir\") pod \"multus-gjjcm\" (UID: \"3e34542c-29e8-43d3-add3-42aa080e0c8a\") " pod="openshift-multus/multus-gjjcm" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.884423 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/b929ab12-7d60-4e9f-a1a7-3e2d8b086d36-rootfs\") pod \"machine-config-daemon-x6p6q\" (UID: \"b929ab12-7d60-4e9f-a1a7-3e2d8b086d36\") " pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.884438 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b929ab12-7d60-4e9f-a1a7-3e2d8b086d36-mcd-auth-proxy-config\") pod \"machine-config-daemon-x6p6q\" (UID: \"b929ab12-7d60-4e9f-a1a7-3e2d8b086d36\") " pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.884457 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ff6nf\" (UniqueName: \"kubernetes.io/projected/b929ab12-7d60-4e9f-a1a7-3e2d8b086d36-kube-api-access-ff6nf\") pod \"machine-config-daemon-x6p6q\" (UID: \"b929ab12-7d60-4e9f-a1a7-3e2d8b086d36\") " pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.884493 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3e34542c-29e8-43d3-add3-42aa080e0c8a-system-cni-dir\") pod \"multus-gjjcm\" (UID: \"3e34542c-29e8-43d3-add3-42aa080e0c8a\") " pod="openshift-multus/multus-gjjcm" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.884510 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3e34542c-29e8-43d3-add3-42aa080e0c8a-os-release\") pod \"multus-gjjcm\" (UID: \"3e34542c-29e8-43d3-add3-42aa080e0c8a\") " pod="openshift-multus/multus-gjjcm" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.884551 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b929ab12-7d60-4e9f-a1a7-3e2d8b086d36-proxy-tls\") pod \"machine-config-daemon-x6p6q\" (UID: \"b929ab12-7d60-4e9f-a1a7-3e2d8b086d36\") " pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.888043 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://222e3c51a925bbfd932c3620f13ad919006e2dad2c9063c60559ece4dd239f3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:32Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.902836 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gjjcm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e34542c-29e8-43d3-add3-42aa080e0c8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqc9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gjjcm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:32Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.920902 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"031bef68-de1b-47ae-92d9-8f3ab57baecb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd53bcc6affff59de2add405be1496710b4488d3794d358533ddaeecb1c50c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cd3af11c101edd63da0124a2190c0ee60408cf12e241de0b5743aff5732738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://787954aa7f87026339e7b167ba10c60c81f91c4d0f234fc508684f929edda0ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c591bf964c364ed5dfb0b30ae43f7fd456c549f701b0f1b8f7ab9ca7b8648f1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7803e19de1c3c096dfb29cd40b66a640cf8fe695b24eaf0ecfef38e88feefc27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:32Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.937721 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.937752 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.937762 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.937777 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 
14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.937786 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:32Z","lastTransitionTime":"2025-12-09T14:58:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.938560 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23358419-437c-41bf-bf1f-0da572d7ae77\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5ec008c2fa2ff6b0bffe7a32708ff1d0bfcfc0b6edbdd084977b99a2051bdc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://376e853c04e3ccafef162872380c7bf1dc0301c04c70c5428b965ca18989088f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6099f0072464b66ae672e70c295fe21a13657b81cc71dcecc55314cd91816c75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-c
ontroller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b8f1c6dd89c91a8ff2ca8baf361ee78811eeb8e10f2c4bb0504644cdd1302d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:32Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.959918 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:32Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.974695 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd5a66337668da0b71f8033ec52d3d1ed9ac69f468d6e48c1e863204ec6a6349\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2fd50693723f3d30a7f5fb890ed870f55b357017b85cdf7206ffc97d0d3d9b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:32Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.984053 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:32Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.985289 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3e34542c-29e8-43d3-add3-42aa080e0c8a-system-cni-dir\") pod \"multus-gjjcm\" (UID: \"3e34542c-29e8-43d3-add3-42aa080e0c8a\") " pod="openshift-multus/multus-gjjcm" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.985335 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3e34542c-29e8-43d3-add3-42aa080e0c8a-os-release\") pod \"multus-gjjcm\" (UID: \"3e34542c-29e8-43d3-add3-42aa080e0c8a\") " pod="openshift-multus/multus-gjjcm" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.985361 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b929ab12-7d60-4e9f-a1a7-3e2d8b086d36-proxy-tls\") pod \"machine-config-daemon-x6p6q\" (UID: \"b929ab12-7d60-4e9f-a1a7-3e2d8b086d36\") " pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.985378 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c97d4dc5-ce47-4a7b-9511-7168db0ee4b4-hosts-file\") pod \"node-resolver-t6kl9\" (UID: \"c97d4dc5-ce47-4a7b-9511-7168db0ee4b4\") " pod="openshift-dns/node-resolver-t6kl9" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.985392 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3e34542c-29e8-43d3-add3-42aa080e0c8a-cnibin\") pod \"multus-gjjcm\" (UID: \"3e34542c-29e8-43d3-add3-42aa080e0c8a\") " pod="openshift-multus/multus-gjjcm" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.985396 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3e34542c-29e8-43d3-add3-42aa080e0c8a-system-cni-dir\") pod \"multus-gjjcm\" (UID: \"3e34542c-29e8-43d3-add3-42aa080e0c8a\") " pod="openshift-multus/multus-gjjcm" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.985407 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3e34542c-29e8-43d3-add3-42aa080e0c8a-cni-binary-copy\") pod \"multus-gjjcm\" (UID: 
\"3e34542c-29e8-43d3-add3-42aa080e0c8a\") " pod="openshift-multus/multus-gjjcm" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.985457 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqc9g\" (UniqueName: \"kubernetes.io/projected/3e34542c-29e8-43d3-add3-42aa080e0c8a-kube-api-access-sqc9g\") pod \"multus-gjjcm\" (UID: \"3e34542c-29e8-43d3-add3-42aa080e0c8a\") " pod="openshift-multus/multus-gjjcm" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.985495 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3e34542c-29e8-43d3-add3-42aa080e0c8a-multus-socket-dir-parent\") pod \"multus-gjjcm\" (UID: \"3e34542c-29e8-43d3-add3-42aa080e0c8a\") " pod="openshift-multus/multus-gjjcm" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.985511 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3e34542c-29e8-43d3-add3-42aa080e0c8a-host-var-lib-cni-bin\") pod \"multus-gjjcm\" (UID: \"3e34542c-29e8-43d3-add3-42aa080e0c8a\") " pod="openshift-multus/multus-gjjcm" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.985529 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3e34542c-29e8-43d3-add3-42aa080e0c8a-hostroot\") pod \"multus-gjjcm\" (UID: \"3e34542c-29e8-43d3-add3-42aa080e0c8a\") " pod="openshift-multus/multus-gjjcm" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.985529 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3e34542c-29e8-43d3-add3-42aa080e0c8a-os-release\") pod \"multus-gjjcm\" (UID: \"3e34542c-29e8-43d3-add3-42aa080e0c8a\") " pod="openshift-multus/multus-gjjcm" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.985544 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3e34542c-29e8-43d3-add3-42aa080e0c8a-multus-daemon-config\") pod \"multus-gjjcm\" (UID: \"3e34542c-29e8-43d3-add3-42aa080e0c8a\") " pod="openshift-multus/multus-gjjcm" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.985559 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3e34542c-29e8-43d3-add3-42aa080e0c8a-etc-kubernetes\") pod \"multus-gjjcm\" (UID: \"3e34542c-29e8-43d3-add3-42aa080e0c8a\") " pod="openshift-multus/multus-gjjcm" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.985572 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3e34542c-29e8-43d3-add3-42aa080e0c8a-host-var-lib-cni-bin\") pod \"multus-gjjcm\" (UID: \"3e34542c-29e8-43d3-add3-42aa080e0c8a\") " pod="openshift-multus/multus-gjjcm" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.985578 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3e34542c-29e8-43d3-add3-42aa080e0c8a-host-run-netns\") pod \"multus-gjjcm\" (UID: \"3e34542c-29e8-43d3-add3-42aa080e0c8a\") " pod="openshift-multus/multus-gjjcm" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.985594 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3e34542c-29e8-43d3-add3-42aa080e0c8a-host-run-multus-certs\") pod \"multus-gjjcm\" (UID: \"3e34542c-29e8-43d3-add3-42aa080e0c8a\") " pod="openshift-multus/multus-gjjcm" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.985607 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3e34542c-29e8-43d3-add3-42aa080e0c8a-cnibin\") pod \"multus-gjjcm\" (UID: \"3e34542c-29e8-43d3-add3-42aa080e0c8a\") " pod="openshift-multus/multus-gjjcm" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.985602 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c97d4dc5-ce47-4a7b-9511-7168db0ee4b4-hosts-file\") pod \"node-resolver-t6kl9\" (UID: \"c97d4dc5-ce47-4a7b-9511-7168db0ee4b4\") " pod="openshift-dns/node-resolver-t6kl9" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.985643 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3e34542c-29e8-43d3-add3-42aa080e0c8a-host-run-k8s-cni-cncf-io\") pod \"multus-gjjcm\" (UID: \"3e34542c-29e8-43d3-add3-42aa080e0c8a\") " pod="openshift-multus/multus-gjjcm" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.985632 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3e34542c-29e8-43d3-add3-42aa080e0c8a-hostroot\") pod \"multus-gjjcm\" (UID: \"3e34542c-29e8-43d3-add3-42aa080e0c8a\") " pod="openshift-multus/multus-gjjcm" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.985623 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3e34542c-29e8-43d3-add3-42aa080e0c8a-host-run-k8s-cni-cncf-io\") pod \"multus-gjjcm\" (UID: \"3e34542c-29e8-43d3-add3-42aa080e0c8a\") " pod="openshift-multus/multus-gjjcm" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.985678 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3e34542c-29e8-43d3-add3-42aa080e0c8a-host-run-multus-certs\") pod \"multus-gjjcm\" (UID: \"3e34542c-29e8-43d3-add3-42aa080e0c8a\") " pod="openshift-multus/multus-gjjcm" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.985682 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3e34542c-29e8-43d3-add3-42aa080e0c8a-etc-kubernetes\") pod \"multus-gjjcm\" (UID: \"3e34542c-29e8-43d3-add3-42aa080e0c8a\") " pod="openshift-multus/multus-gjjcm" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.985710 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3e34542c-29e8-43d3-add3-42aa080e0c8a-host-run-netns\") pod \"multus-gjjcm\" (UID: \"3e34542c-29e8-43d3-add3-42aa080e0c8a\") " pod="openshift-multus/multus-gjjcm" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.985724 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3e34542c-29e8-43d3-add3-42aa080e0c8a-multus-cni-dir\") pod \"multus-gjjcm\" (UID: \"3e34542c-29e8-43d3-add3-42aa080e0c8a\") " pod="openshift-multus/multus-gjjcm" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.985754 4555 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3e34542c-29e8-43d3-add3-42aa080e0c8a-multus-socket-dir-parent\") pod \"multus-gjjcm\" (UID: \"3e34542c-29e8-43d3-add3-42aa080e0c8a\") " pod="openshift-multus/multus-gjjcm" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.985798 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3e34542c-29e8-43d3-add3-42aa080e0c8a-host-var-lib-kubelet\") pod \"multus-gjjcm\" (UID: \"3e34542c-29e8-43d3-add3-42aa080e0c8a\") " pod="openshift-multus/multus-gjjcm" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.985833 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47xhl\" (UniqueName: \"kubernetes.io/projected/c97d4dc5-ce47-4a7b-9511-7168db0ee4b4-kube-api-access-47xhl\") pod \"node-resolver-t6kl9\" (UID: \"c97d4dc5-ce47-4a7b-9511-7168db0ee4b4\") " pod="openshift-dns/node-resolver-t6kl9" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.985843 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3e34542c-29e8-43d3-add3-42aa080e0c8a-multus-cni-dir\") pod \"multus-gjjcm\" (UID: \"3e34542c-29e8-43d3-add3-42aa080e0c8a\") " pod="openshift-multus/multus-gjjcm" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.985853 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3e34542c-29e8-43d3-add3-42aa080e0c8a-host-var-lib-cni-multus\") pod \"multus-gjjcm\" (UID: \"3e34542c-29e8-43d3-add3-42aa080e0c8a\") " pod="openshift-multus/multus-gjjcm" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.985878 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3e34542c-29e8-43d3-add3-42aa080e0c8a-host-var-lib-kubelet\") pod \"multus-gjjcm\" (UID: \"3e34542c-29e8-43d3-add3-42aa080e0c8a\") " pod="openshift-multus/multus-gjjcm" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.985947 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3e34542c-29e8-43d3-add3-42aa080e0c8a-multus-conf-dir\") pod \"multus-gjjcm\" (UID: \"3e34542c-29e8-43d3-add3-42aa080e0c8a\") " pod="openshift-multus/multus-gjjcm" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.985961 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3e34542c-29e8-43d3-add3-42aa080e0c8a-host-var-lib-cni-multus\") pod \"multus-gjjcm\" (UID: \"3e34542c-29e8-43d3-add3-42aa080e0c8a\") " pod="openshift-multus/multus-gjjcm" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.985966 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3e34542c-29e8-43d3-add3-42aa080e0c8a-multus-conf-dir\") pod \"multus-gjjcm\" (UID: \"3e34542c-29e8-43d3-add3-42aa080e0c8a\") " pod="openshift-multus/multus-gjjcm" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.985994 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3e34542c-29e8-43d3-add3-42aa080e0c8a-cni-binary-copy\") pod \"multus-gjjcm\" (UID: 
\"3e34542c-29e8-43d3-add3-42aa080e0c8a\") " pod="openshift-multus/multus-gjjcm" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.986006 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/b929ab12-7d60-4e9f-a1a7-3e2d8b086d36-rootfs\") pod \"machine-config-daemon-x6p6q\" (UID: \"b929ab12-7d60-4e9f-a1a7-3e2d8b086d36\") " pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.986039 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b929ab12-7d60-4e9f-a1a7-3e2d8b086d36-mcd-auth-proxy-config\") pod \"machine-config-daemon-x6p6q\" (UID: \"b929ab12-7d60-4e9f-a1a7-3e2d8b086d36\") " pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.986043 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/b929ab12-7d60-4e9f-a1a7-3e2d8b086d36-rootfs\") pod \"machine-config-daemon-x6p6q\" (UID: \"b929ab12-7d60-4e9f-a1a7-3e2d8b086d36\") " pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.986060 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ff6nf\" (UniqueName: \"kubernetes.io/projected/b929ab12-7d60-4e9f-a1a7-3e2d8b086d36-kube-api-access-ff6nf\") pod \"machine-config-daemon-x6p6q\" (UID: \"b929ab12-7d60-4e9f-a1a7-3e2d8b086d36\") " pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.986135 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3e34542c-29e8-43d3-add3-42aa080e0c8a-multus-daemon-config\") pod \"multus-gjjcm\" (UID: \"3e34542c-29e8-43d3-add3-42aa080e0c8a\") " pod="openshift-multus/multus-gjjcm" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.986615 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b929ab12-7d60-4e9f-a1a7-3e2d8b086d36-mcd-auth-proxy-config\") pod \"machine-config-daemon-x6p6q\" (UID: \"b929ab12-7d60-4e9f-a1a7-3e2d8b086d36\") " pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" Dec 09 14:58:32 crc kubenswrapper[4555]: I1209 14:58:32.989582 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b929ab12-7d60-4e9f-a1a7-3e2d8b086d36-proxy-tls\") pod \"machine-config-daemon-x6p6q\" (UID: \"b929ab12-7d60-4e9f-a1a7-3e2d8b086d36\") " pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.001269 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ff6nf\" (UniqueName: \"kubernetes.io/projected/b929ab12-7d60-4e9f-a1a7-3e2d8b086d36-kube-api-access-ff6nf\") pod \"machine-config-daemon-x6p6q\" (UID: \"b929ab12-7d60-4e9f-a1a7-3e2d8b086d36\") " pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.001365 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqc9g\" (UniqueName: 
\"kubernetes.io/projected/3e34542c-29e8-43d3-add3-42aa080e0c8a-kube-api-access-sqc9g\") pod \"multus-gjjcm\" (UID: \"3e34542c-29e8-43d3-add3-42aa080e0c8a\") " pod="openshift-multus/multus-gjjcm" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.006400 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47xhl\" (UniqueName: \"kubernetes.io/projected/c97d4dc5-ce47-4a7b-9511-7168db0ee4b4-kube-api-access-47xhl\") pod \"node-resolver-t6kl9\" (UID: \"c97d4dc5-ce47-4a7b-9511-7168db0ee4b4\") " pod="openshift-dns/node-resolver-t6kl9" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.007243 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"031bef68-de1b-47ae-92d9-8f3ab57baecb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd53bcc6affff59de2add405be1496710b4488d3794d358533ddaeecb1c50c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cd3af11c101edd63da0124a2190c0ee60408cf12e241de0b5743aff5732738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://787954aa7f87026339e7b167ba10c60c81f91c4d0f234fc508684f929edda0ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b
8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c591bf964c364ed5dfb0b30ae43f7fd456c549f701b0f1b8f7ab9ca7b8648f1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7803e19de1c3c096dfb29cd40b66a640cf8fe695b24eaf0ecfef38e88feefc27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-09T14:58:33Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.016619 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23358419-437c-41bf-bf1f-0da572d7ae77\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5ec008c2fa2ff6b0bffe7a32708ff1d0bfcfc0b6edbdd084977b99a2051bdc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://376e853c04e3ccafef162872380c7bf1dc0301c04c70c5428b965ca18989088f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6099f0072464b66ae672e70c295fe21a13657b81cc71dcecc55314cd91816c75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resour
ces\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b8f1c6dd89c91a8ff2ca8baf361ee78811eeb8e10f2c4bb0504644cdd1302d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:33Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.026829 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:33Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.035350 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://222e3c51a925bbfd932c3620f13ad919006e2dad2c9063c60559ece4dd239f3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:33Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.040253 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:33 
crc kubenswrapper[4555]: I1209 14:58:33.040309 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.040322 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.040337 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.040347 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:33Z","lastTransitionTime":"2025-12-09T14:58:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.045218 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gjjcm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e34542c-29e8-43d3-add3-42aa080e0c8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqc9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gjjcm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:33Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.061742 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cb89a32-a132-4a5c-88ca-e561d2462abb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8452332d27457e371020c42533ef78285e33851568eed69be71bc9d10e663b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32422f3e8abddcc0d533cce0b4fc55c65309e00a02d4757d415d6f63f9b2e47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9db356805ba4080379f39ddf432276ac77417504341e77c1cc2d704756af1f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550d7f8a2e6cf94de70026192f2660d59f31fa1
675a7c7e432767db5bf58c1b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://630ad7ccda8e51ae4d116b5821e71ebcc3fb68b4614de58470610807ad351ed6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:33Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.073717 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d81323acabedcf739459ccc35af7dcbf68cd1159c9304c0ef841d774287425cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:33Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.091074 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:33Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.105801 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:33Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.124189 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd5a66337668da0b71f8033ec52d3d1ed9ac69f468d6e48c1e863204ec6a6349\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2fd50693723f3d30a7f5fb890ed870f55b357017b85cdf7206ffc97d0d3d9b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:33Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.136927 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-gjjcm" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.138356 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b929ab12-7d60-4e9f-a1a7-3e2d8b086d36\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff6nf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff6nf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x6p6q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:33Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:33 crc 
kubenswrapper[4555]: I1209 14:58:33.144153 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-t6kl9" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.144884 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.144908 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.144916 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.144933 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.144942 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:33Z","lastTransitionTime":"2025-12-09T14:58:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:33 crc kubenswrapper[4555]: W1209 14:58:33.146858 4555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e34542c_29e8_43d3_add3_42aa080e0c8a.slice/crio-1368e8407653239a27e8576cd89e856ba364c9a6be01fc9ef856358ead76a556 WatchSource:0}: Error finding container 1368e8407653239a27e8576cd89e856ba364c9a6be01fc9ef856358ead76a556: Status 404 returned error can't find the container with id 1368e8407653239a27e8576cd89e856ba364c9a6be01fc9ef856358ead76a556 Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.149638 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" Dec 09 14:58:33 crc kubenswrapper[4555]: W1209 14:58:33.155781 4555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc97d4dc5_ce47_4a7b_9511_7168db0ee4b4.slice/crio-fac8de8be7e3bb14f9dd9d6e1f102ddca26b0a27314d853605eb2ef5b161da14 WatchSource:0}: Error finding container fac8de8be7e3bb14f9dd9d6e1f102ddca26b0a27314d853605eb2ef5b161da14: Status 404 returned error can't find the container with id fac8de8be7e3bb14f9dd9d6e1f102ddca26b0a27314d853605eb2ef5b161da14 Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.159619 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t6kl9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c97d4dc5-ce47-4a7b-9511-7168db0ee4b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47xhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t6kl9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:33Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.160972 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.161014 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:58:33 crc kubenswrapper[4555]: E1209 14:58:33.161067 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.161092 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:58:33 crc kubenswrapper[4555]: E1209 14:58:33.161133 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 14:58:33 crc kubenswrapper[4555]: E1209 14:58:33.161214 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 14:58:33 crc kubenswrapper[4555]: W1209 14:58:33.162332 4555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb929ab12_7d60_4e9f_a1a7_3e2d8b086d36.slice/crio-50648c18b483d69df7f7ba9c80be125e180456b635a74e59154f27c63e2f9b65 WatchSource:0}: Error finding container 50648c18b483d69df7f7ba9c80be125e180456b635a74e59154f27c63e2f9b65: Status 404 returned error can't find the container with id 50648c18b483d69df7f7ba9c80be125e180456b635a74e59154f27c63e2f9b65 Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.209851 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-tpsq2"] Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.210491 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-tpsq2" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.212033 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.213528 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.226635 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cb89a32-a132-4a5c-88ca-e561d2462abb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8452332d27457e371020c42533ef78285e33851568eed69be71bc9d10e663b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32422f3e8abddcc0d533cce0b4fc55c65309e00a02d4757d415d6f63f9b2e47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9db356805ba4080379f39ddf432276ac77417504341e77c1cc2d704756af1f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f4
2928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550d7f8a2e6cf94de70026192f2660d59f31fa1675a7c7e432767db5bf58c1b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://630ad7ccda8e51ae4d116b5821e71ebcc3fb68b4614de58470610807ad351ed6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:33Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.231525 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-t6kl9" event={"ID":"c97d4dc5-ce47-4a7b-9511-7168db0ee4b4","Type":"ContainerStarted","Data":"fac8de8be7e3bb14f9dd9d6e1f102ddca26b0a27314d853605eb2ef5b161da14"} Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.233115 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gjjcm" event={"ID":"3e34542c-29e8-43d3-add3-42aa080e0c8a","Type":"ContainerStarted","Data":"1368e8407653239a27e8576cd89e856ba364c9a6be01fc9ef856358ead76a556"} Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.234168 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" event={"ID":"b929ab12-7d60-4e9f-a1a7-3e2d8b086d36","Type":"ContainerStarted","Data":"50648c18b483d69df7f7ba9c80be125e180456b635a74e59154f27c63e2f9b65"} Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.237614 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d81323acabedcf739459ccc35af7dcbf68cd1159c9304c0ef841d774287425cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:33Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.247103 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.247132 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.247142 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.247156 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.247165 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:33Z","lastTransitionTime":"2025-12-09T14:58:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.249582 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:33Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.258686 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://222e3c51a925bbfd932c3620f13ad919006e2dad2c9063c60559ece4dd239f3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:33Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.269152 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gjjcm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e34542c-29e8-43d3-add3-42aa080e0c8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqc9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gjjcm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:33Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.278489 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:33Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.289383 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/dd8f66c1-da34-466f-ba53-4848c96b1287-cni-binary-copy\") pod \"multus-additional-cni-plugins-tpsq2\" (UID: \"dd8f66c1-da34-466f-ba53-4848c96b1287\") " pod="openshift-multus/multus-additional-cni-plugins-tpsq2" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.289462 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dd8f66c1-da34-466f-ba53-4848c96b1287-system-cni-dir\") pod \"multus-additional-cni-plugins-tpsq2\" (UID: \"dd8f66c1-da34-466f-ba53-4848c96b1287\") " pod="openshift-multus/multus-additional-cni-plugins-tpsq2" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.289518 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/dd8f66c1-da34-466f-ba53-4848c96b1287-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-tpsq2\" (UID: \"dd8f66c1-da34-466f-ba53-4848c96b1287\") " pod="openshift-multus/multus-additional-cni-plugins-tpsq2" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.289539 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrmwb\" (UniqueName: \"kubernetes.io/projected/dd8f66c1-da34-466f-ba53-4848c96b1287-kube-api-access-nrmwb\") pod \"multus-additional-cni-plugins-tpsq2\" (UID: \"dd8f66c1-da34-466f-ba53-4848c96b1287\") " pod="openshift-multus/multus-additional-cni-plugins-tpsq2" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.289565 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/dd8f66c1-da34-466f-ba53-4848c96b1287-cnibin\") pod \"multus-additional-cni-plugins-tpsq2\" (UID: \"dd8f66c1-da34-466f-ba53-4848c96b1287\") " pod="openshift-multus/multus-additional-cni-plugins-tpsq2" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.289584 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/dd8f66c1-da34-466f-ba53-4848c96b1287-os-release\") pod \"multus-additional-cni-plugins-tpsq2\" (UID: \"dd8f66c1-da34-466f-ba53-4848c96b1287\") " pod="openshift-multus/multus-additional-cni-plugins-tpsq2" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.289683 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/dd8f66c1-da34-466f-ba53-4848c96b1287-tuning-conf-dir\") pod \"multus-additional-cni-plugins-tpsq2\" (UID: \"dd8f66c1-da34-466f-ba53-4848c96b1287\") " pod="openshift-multus/multus-additional-cni-plugins-tpsq2" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.290881 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd5a66337668da0b71f8033ec52d3d1ed9ac69f468d6e48c1e863204ec6a6349\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2fd50693723f3d30a7f5fb890ed870f55b357017b85cdf7206ffc97d0d3d9b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\
"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:33Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.300845 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b929ab12-7d60-4e9f-a1a7-3e2d8b086d36\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff6nf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff6nf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x6p6q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:33Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.310787 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tpsq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd8f66c1-da34-466f-ba53-4848c96b1287\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"
/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tpsq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:33Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.321123 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t6kl9" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c97d4dc5-ce47-4a7b-9511-7168db0ee4b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47xhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t6kl9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:33Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.337690 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"031bef68-de1b-47ae-92d9-8f3ab57baecb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd53bcc6affff59de2add405be1496710b4488d3794d358533ddaeecb1c50c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cd3af11c101edd63da0124a2190c0ee60408cf12e241de0b5743aff5732738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://787954aa7f87026339e7b167ba10c60c81f91c4d0f234fc508684f929edda0ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c591bf964c364ed5dfb0b30ae43f7fd456c549f701b0f1b8f7ab9ca7b8648f1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7803e19de1c3c096dfb29cd40b66a640cf8fe695b24eaf0ecfef38e88feefc27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:33Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.348978 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.349097 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.349192 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.349264 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 
14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.349348 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:33Z","lastTransitionTime":"2025-12-09T14:58:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.355874 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23358419-437c-41bf-bf1f-0da572d7ae77\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5ec008c2fa2ff6b0bffe7a32708ff1d0bfcfc0b6edbdd084977b99a2051bdc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://376e853c04e3ccafef162872380c7bf1dc0301c04c70c5428b965ca18989088f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6099f0072464b66ae672e70c295fe21a13657b81cc71dcecc55314cd91816c75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-c
ontroller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b8f1c6dd89c91a8ff2ca8baf361ee78811eeb8e10f2c4bb0504644cdd1302d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:33Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.379168 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:33Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.390629 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/dd8f66c1-da34-466f-ba53-4848c96b1287-cni-binary-copy\") pod \"multus-additional-cni-plugins-tpsq2\" (UID: \"dd8f66c1-da34-466f-ba53-4848c96b1287\") " pod="openshift-multus/multus-additional-cni-plugins-tpsq2" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.390678 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dd8f66c1-da34-466f-ba53-4848c96b1287-system-cni-dir\") pod \"multus-additional-cni-plugins-tpsq2\" (UID: \"dd8f66c1-da34-466f-ba53-4848c96b1287\") " pod="openshift-multus/multus-additional-cni-plugins-tpsq2" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.390703 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/dd8f66c1-da34-466f-ba53-4848c96b1287-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-tpsq2\" (UID: \"dd8f66c1-da34-466f-ba53-4848c96b1287\") " pod="openshift-multus/multus-additional-cni-plugins-tpsq2" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.390727 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrmwb\" (UniqueName: \"kubernetes.io/projected/dd8f66c1-da34-466f-ba53-4848c96b1287-kube-api-access-nrmwb\") pod \"multus-additional-cni-plugins-tpsq2\" (UID: \"dd8f66c1-da34-466f-ba53-4848c96b1287\") " pod="openshift-multus/multus-additional-cni-plugins-tpsq2" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.390758 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/dd8f66c1-da34-466f-ba53-4848c96b1287-cnibin\") pod \"multus-additional-cni-plugins-tpsq2\" (UID: \"dd8f66c1-da34-466f-ba53-4848c96b1287\") " pod="openshift-multus/multus-additional-cni-plugins-tpsq2" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.390778 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/dd8f66c1-da34-466f-ba53-4848c96b1287-os-release\") pod \"multus-additional-cni-plugins-tpsq2\" (UID: \"dd8f66c1-da34-466f-ba53-4848c96b1287\") " pod="openshift-multus/multus-additional-cni-plugins-tpsq2" Dec 09 14:58:33 crc 
kubenswrapper[4555]: I1209 14:58:33.390799 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/dd8f66c1-da34-466f-ba53-4848c96b1287-tuning-conf-dir\") pod \"multus-additional-cni-plugins-tpsq2\" (UID: \"dd8f66c1-da34-466f-ba53-4848c96b1287\") " pod="openshift-multus/multus-additional-cni-plugins-tpsq2" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.390935 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/dd8f66c1-da34-466f-ba53-4848c96b1287-tuning-conf-dir\") pod \"multus-additional-cni-plugins-tpsq2\" (UID: \"dd8f66c1-da34-466f-ba53-4848c96b1287\") " pod="openshift-multus/multus-additional-cni-plugins-tpsq2" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.391091 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/dd8f66c1-da34-466f-ba53-4848c96b1287-cnibin\") pod \"multus-additional-cni-plugins-tpsq2\" (UID: \"dd8f66c1-da34-466f-ba53-4848c96b1287\") " pod="openshift-multus/multus-additional-cni-plugins-tpsq2" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.391165 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/dd8f66c1-da34-466f-ba53-4848c96b1287-os-release\") pod \"multus-additional-cni-plugins-tpsq2\" (UID: \"dd8f66c1-da34-466f-ba53-4848c96b1287\") " pod="openshift-multus/multus-additional-cni-plugins-tpsq2" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.391094 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dd8f66c1-da34-466f-ba53-4848c96b1287-system-cni-dir\") pod \"multus-additional-cni-plugins-tpsq2\" (UID: \"dd8f66c1-da34-466f-ba53-4848c96b1287\") " pod="openshift-multus/multus-additional-cni-plugins-tpsq2" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.391631 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/dd8f66c1-da34-466f-ba53-4848c96b1287-cni-binary-copy\") pod \"multus-additional-cni-plugins-tpsq2\" (UID: \"dd8f66c1-da34-466f-ba53-4848c96b1287\") " pod="openshift-multus/multus-additional-cni-plugins-tpsq2" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.391714 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/dd8f66c1-da34-466f-ba53-4848c96b1287-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-tpsq2\" (UID: \"dd8f66c1-da34-466f-ba53-4848c96b1287\") " pod="openshift-multus/multus-additional-cni-plugins-tpsq2" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.405590 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrmwb\" (UniqueName: \"kubernetes.io/projected/dd8f66c1-da34-466f-ba53-4848c96b1287-kube-api-access-nrmwb\") pod \"multus-additional-cni-plugins-tpsq2\" (UID: \"dd8f66c1-da34-466f-ba53-4848c96b1287\") " pod="openshift-multus/multus-additional-cni-plugins-tpsq2" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.451566 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.451674 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.451757 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.451837 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.451899 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:33Z","lastTransitionTime":"2025-12-09T14:58:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.485103 4555 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-12-09 14:53:32 +0000 UTC, rotation deadline is 2026-09-08 02:24:14.687838006 +0000 UTC Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.485170 4555 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6539h25m41.202670383s for next certificate rotation Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.520540 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-tpsq2" Dec 09 14:58:33 crc kubenswrapper[4555]: W1209 14:58:33.529062 4555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd8f66c1_da34_466f_ba53_4848c96b1287.slice/crio-84c57bb853057f0ec77e0c8b384a96c7914a872577372ad3314fd2e36bbb77d5 WatchSource:0}: Error finding container 84c57bb853057f0ec77e0c8b384a96c7914a872577372ad3314fd2e36bbb77d5: Status 404 returned error can't find the container with id 84c57bb853057f0ec77e0c8b384a96c7914a872577372ad3314fd2e36bbb77d5 Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.553777 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.553801 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.553809 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.553822 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.553832 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:33Z","lastTransitionTime":"2025-12-09T14:58:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.555677 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-p4bl4"] Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.556387 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.557847 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.557903 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.558070 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.558122 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.558504 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.559355 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.559514 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.568894 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t6kl9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c97d4dc5-ce47-4a7b-9511-7168db0ee4b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47xhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t6kl9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:33Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.579154 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:33Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.589588 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"031bef68-de1b-47ae-92d9-8f3ab57baecb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd53bcc6affff59de2add405be1496710b4488d3794d358533ddaeecb1c50c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cd3af11c101edd63da0124a2190c0ee60408cf12e241de0b5743aff5732738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://787954aa7f87026339e7b167ba10c60c81f91c4d0f234fc508684f929edda0ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c591bf964c364ed5dfb0b30ae43f7fd456c549f701b0f1b8f7ab9ca7b8648f1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7803e19de1c3c096dfb29cd40b66a640cf8fe695b24eaf0ecfef38e88feefc27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:33Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.592072 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-host-run-netns\") pod \"ovnkube-node-p4bl4\" (UID: \"c87d128c-39da-4880-9efc-7d2e6319a854\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.592110 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-systemd-units\") pod \"ovnkube-node-p4bl4\" (UID: \"c87d128c-39da-4880-9efc-7d2e6319a854\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.592129 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-var-lib-openvswitch\") pod \"ovnkube-node-p4bl4\" (UID: \"c87d128c-39da-4880-9efc-7d2e6319a854\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.592149 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-host-kubelet\") pod \"ovnkube-node-p4bl4\" (UID: \"c87d128c-39da-4880-9efc-7d2e6319a854\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.592164 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-p4bl4\" (UID: \"c87d128c-39da-4880-9efc-7d2e6319a854\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.592181 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c87d128c-39da-4880-9efc-7d2e6319a854-ovnkube-script-lib\") pod \"ovnkube-node-p4bl4\" (UID: \"c87d128c-39da-4880-9efc-7d2e6319a854\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.592290 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-host-slash\") pod \"ovnkube-node-p4bl4\" (UID: \"c87d128c-39da-4880-9efc-7d2e6319a854\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.592352 4555 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c87d128c-39da-4880-9efc-7d2e6319a854-ovn-node-metrics-cert\") pod \"ovnkube-node-p4bl4\" (UID: \"c87d128c-39da-4880-9efc-7d2e6319a854\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.592384 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-host-cni-netd\") pod \"ovnkube-node-p4bl4\" (UID: \"c87d128c-39da-4880-9efc-7d2e6319a854\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.592400 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c87d128c-39da-4880-9efc-7d2e6319a854-ovnkube-config\") pod \"ovnkube-node-p4bl4\" (UID: \"c87d128c-39da-4880-9efc-7d2e6319a854\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.592425 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-etc-openvswitch\") pod \"ovnkube-node-p4bl4\" (UID: \"c87d128c-39da-4880-9efc-7d2e6319a854\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.592455 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-node-log\") pod \"ovnkube-node-p4bl4\" (UID: \"c87d128c-39da-4880-9efc-7d2e6319a854\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.592487 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-run-systemd\") pod \"ovnkube-node-p4bl4\" (UID: \"c87d128c-39da-4880-9efc-7d2e6319a854\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.592506 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-run-ovn\") pod \"ovnkube-node-p4bl4\" (UID: \"c87d128c-39da-4880-9efc-7d2e6319a854\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.592523 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7p8n9\" (UniqueName: \"kubernetes.io/projected/c87d128c-39da-4880-9efc-7d2e6319a854-kube-api-access-7p8n9\") pod \"ovnkube-node-p4bl4\" (UID: \"c87d128c-39da-4880-9efc-7d2e6319a854\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.592548 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-host-run-ovn-kubernetes\") pod \"ovnkube-node-p4bl4\" (UID: \"c87d128c-39da-4880-9efc-7d2e6319a854\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" Dec 09 
14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.592590 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-run-openvswitch\") pod \"ovnkube-node-p4bl4\" (UID: \"c87d128c-39da-4880-9efc-7d2e6319a854\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.592607 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-log-socket\") pod \"ovnkube-node-p4bl4\" (UID: \"c87d128c-39da-4880-9efc-7d2e6319a854\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.592644 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-host-cni-bin\") pod \"ovnkube-node-p4bl4\" (UID: \"c87d128c-39da-4880-9efc-7d2e6319a854\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.592658 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c87d128c-39da-4880-9efc-7d2e6319a854-env-overrides\") pod \"ovnkube-node-p4bl4\" (UID: \"c87d128c-39da-4880-9efc-7d2e6319a854\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.598270 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23358419-437c-41bf-bf1f-0da572d7ae77\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5ec008c2fa2ff6b0bffe7a32708ff1d0bfcfc0b6edbdd084977b99a2051bdc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://376e853c04e3ccafef162872380c7bf1dc0301c04c70c5428b965ca18989088f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6099f0072464b66ae672e70c295fe21a13657b81cc71dcecc55314cd91816c75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b8f1c6dd89c91a8ff2ca8baf361ee78811eeb8e10f2c4bb0504644cdd1302d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:33Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.608936 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d81323acabedcf739459ccc35af7dcbf68cd1159c9304c0ef841d774287425cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:33Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.617274 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:33Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.626023 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://222e3c51a925bbfd932c3620f13ad919006e2dad2c9063c60559ece4dd239f3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:33Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.637254 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gjjcm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e34542c-29e8-43d3-add3-42aa080e0c8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqc9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gjjcm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:33Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.651978 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c87d128c-39da-4880-9efc-7d2e6319a854\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p4bl4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:33Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.655487 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.655520 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.655529 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.655544 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.655552 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:33Z","lastTransitionTime":"2025-12-09T14:58:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.667437 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cb89a32-a132-4a5c-88ca-e561d2462abb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8452332d27457e371020c42533ef78285e33851568eed69be71bc9d10e663b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32422f3e8abddcc0d533cce0b4fc55c65309e00a02d4757d415d6f63f9b2e47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9db356805ba4080379f39ddf432276ac77417504341e77c1cc2d704756af1f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550d7f8a2e6cf94de70026192f2660d59f31fa1675a7c7e432767db5bf58c1b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://630ad7ccda8e51ae4d116b5821e71ebcc3fb68b4614de58470610807ad351ed6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:33Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.676878 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd5a66337668da0b71f8033ec52d3d1ed9ac69f468d6e48c1e863204ec6a6349\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2fd50693723f3d30a7f5fb890ed870f55b357017b85cdf7206ffc97d0d3d9b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:33Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.684693 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b929ab12-7d60-4e9f-a1a7-3e2d8b086d36\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff6nf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff6nf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x6p6q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:33Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.693618 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-run-systemd\") pod \"ovnkube-node-p4bl4\" (UID: 
\"c87d128c-39da-4880-9efc-7d2e6319a854\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.693655 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-node-log\") pod \"ovnkube-node-p4bl4\" (UID: \"c87d128c-39da-4880-9efc-7d2e6319a854\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.693676 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-run-ovn\") pod \"ovnkube-node-p4bl4\" (UID: \"c87d128c-39da-4880-9efc-7d2e6319a854\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.693701 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7p8n9\" (UniqueName: \"kubernetes.io/projected/c87d128c-39da-4880-9efc-7d2e6319a854-kube-api-access-7p8n9\") pod \"ovnkube-node-p4bl4\" (UID: \"c87d128c-39da-4880-9efc-7d2e6319a854\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.693720 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-host-run-ovn-kubernetes\") pod \"ovnkube-node-p4bl4\" (UID: \"c87d128c-39da-4880-9efc-7d2e6319a854\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.693753 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-run-openvswitch\") pod \"ovnkube-node-p4bl4\" (UID: \"c87d128c-39da-4880-9efc-7d2e6319a854\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.693758 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-run-ovn\") pod \"ovnkube-node-p4bl4\" (UID: \"c87d128c-39da-4880-9efc-7d2e6319a854\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.693789 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-run-openvswitch\") pod \"ovnkube-node-p4bl4\" (UID: \"c87d128c-39da-4880-9efc-7d2e6319a854\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.693766 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-log-socket\") pod \"ovnkube-node-p4bl4\" (UID: \"c87d128c-39da-4880-9efc-7d2e6319a854\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.693836 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-run-systemd\") pod \"ovnkube-node-p4bl4\" (UID: \"c87d128c-39da-4880-9efc-7d2e6319a854\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 
14:58:33.693793 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-log-socket\") pod \"ovnkube-node-p4bl4\" (UID: \"c87d128c-39da-4880-9efc-7d2e6319a854\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.693869 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-host-cni-bin\") pod \"ovnkube-node-p4bl4\" (UID: \"c87d128c-39da-4880-9efc-7d2e6319a854\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.693758 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-node-log\") pod \"ovnkube-node-p4bl4\" (UID: \"c87d128c-39da-4880-9efc-7d2e6319a854\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.693855 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-host-cni-bin\") pod \"ovnkube-node-p4bl4\" (UID: \"c87d128c-39da-4880-9efc-7d2e6319a854\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.693810 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-host-run-ovn-kubernetes\") pod \"ovnkube-node-p4bl4\" (UID: \"c87d128c-39da-4880-9efc-7d2e6319a854\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.693902 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c87d128c-39da-4880-9efc-7d2e6319a854-env-overrides\") pod \"ovnkube-node-p4bl4\" (UID: \"c87d128c-39da-4880-9efc-7d2e6319a854\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.693988 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-host-run-netns\") pod \"ovnkube-node-p4bl4\" (UID: \"c87d128c-39da-4880-9efc-7d2e6319a854\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.694017 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-systemd-units\") pod \"ovnkube-node-p4bl4\" (UID: \"c87d128c-39da-4880-9efc-7d2e6319a854\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.694034 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-var-lib-openvswitch\") pod \"ovnkube-node-p4bl4\" (UID: \"c87d128c-39da-4880-9efc-7d2e6319a854\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.694052 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-host-kubelet\") pod \"ovnkube-node-p4bl4\" (UID: \"c87d128c-39da-4880-9efc-7d2e6319a854\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.694071 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-systemd-units\") pod \"ovnkube-node-p4bl4\" (UID: \"c87d128c-39da-4880-9efc-7d2e6319a854\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.694046 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-host-run-netns\") pod \"ovnkube-node-p4bl4\" (UID: \"c87d128c-39da-4880-9efc-7d2e6319a854\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.694105 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-var-lib-openvswitch\") pod \"ovnkube-node-p4bl4\" (UID: \"c87d128c-39da-4880-9efc-7d2e6319a854\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.694069 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-p4bl4\" (UID: \"c87d128c-39da-4880-9efc-7d2e6319a854\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.694119 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-host-kubelet\") pod \"ovnkube-node-p4bl4\" (UID: \"c87d128c-39da-4880-9efc-7d2e6319a854\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.694140 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-p4bl4\" (UID: \"c87d128c-39da-4880-9efc-7d2e6319a854\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.694188 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c87d128c-39da-4880-9efc-7d2e6319a854-ovnkube-script-lib\") pod \"ovnkube-node-p4bl4\" (UID: \"c87d128c-39da-4880-9efc-7d2e6319a854\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.694211 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-host-slash\") pod \"ovnkube-node-p4bl4\" (UID: \"c87d128c-39da-4880-9efc-7d2e6319a854\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.694228 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-host-cni-netd\") pod \"ovnkube-node-p4bl4\" (UID: \"c87d128c-39da-4880-9efc-7d2e6319a854\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.694229 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-host-slash\") pod \"ovnkube-node-p4bl4\" (UID: \"c87d128c-39da-4880-9efc-7d2e6319a854\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.694244 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c87d128c-39da-4880-9efc-7d2e6319a854-ovnkube-config\") pod \"ovnkube-node-p4bl4\" (UID: \"c87d128c-39da-4880-9efc-7d2e6319a854\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.694258 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c87d128c-39da-4880-9efc-7d2e6319a854-ovn-node-metrics-cert\") pod \"ovnkube-node-p4bl4\" (UID: \"c87d128c-39da-4880-9efc-7d2e6319a854\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.694271 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-host-cni-netd\") pod \"ovnkube-node-p4bl4\" (UID: \"c87d128c-39da-4880-9efc-7d2e6319a854\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.694292 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-etc-openvswitch\") pod \"ovnkube-node-p4bl4\" (UID: \"c87d128c-39da-4880-9efc-7d2e6319a854\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.694274 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-etc-openvswitch\") pod \"ovnkube-node-p4bl4\" (UID: \"c87d128c-39da-4880-9efc-7d2e6319a854\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.694420 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c87d128c-39da-4880-9efc-7d2e6319a854-env-overrides\") pod \"ovnkube-node-p4bl4\" (UID: \"c87d128c-39da-4880-9efc-7d2e6319a854\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.694731 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c87d128c-39da-4880-9efc-7d2e6319a854-ovnkube-script-lib\") pod \"ovnkube-node-p4bl4\" (UID: \"c87d128c-39da-4880-9efc-7d2e6319a854\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.694801 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c87d128c-39da-4880-9efc-7d2e6319a854-ovnkube-config\") pod \"ovnkube-node-p4bl4\" (UID: 
\"c87d128c-39da-4880-9efc-7d2e6319a854\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.695906 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tpsq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd8f66c1-da34-466f-ba53-4848c96b1287\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":
\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8c
cf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tpsq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:33Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.696605 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c87d128c-39da-4880-9efc-7d2e6319a854-ovn-node-metrics-cert\") pod \"ovnkube-node-p4bl4\" (UID: \"c87d128c-39da-4880-9efc-7d2e6319a854\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.705080 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:33Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.706492 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7p8n9\" (UniqueName: \"kubernetes.io/projected/c87d128c-39da-4880-9efc-7d2e6319a854-kube-api-access-7p8n9\") pod \"ovnkube-node-p4bl4\" (UID: \"c87d128c-39da-4880-9efc-7d2e6319a854\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.758283 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.758327 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.758336 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.758352 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.758361 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:33Z","lastTransitionTime":"2025-12-09T14:58:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.860768 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.860800 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.860810 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.860825 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.860836 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:33Z","lastTransitionTime":"2025-12-09T14:58:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.866686 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" Dec 09 14:58:33 crc kubenswrapper[4555]: W1209 14:58:33.876112 4555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc87d128c_39da_4880_9efc_7d2e6319a854.slice/crio-a1ccd13de6fb64e2a009e568b924fd28d4b6edc43b474b7b6a7461ff1b7a1970 WatchSource:0}: Error finding container a1ccd13de6fb64e2a009e568b924fd28d4b6edc43b474b7b6a7461ff1b7a1970: Status 404 returned error can't find the container with id a1ccd13de6fb64e2a009e568b924fd28d4b6edc43b474b7b6a7461ff1b7a1970 Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.962918 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.962961 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.962971 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.962987 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:33 crc kubenswrapper[4555]: I1209 14:58:33.962998 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:33Z","lastTransitionTime":"2025-12-09T14:58:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.064947 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.064991 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.065002 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.065019 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.065030 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:34Z","lastTransitionTime":"2025-12-09T14:58:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.167059 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.167106 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.167118 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.167134 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.167145 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:34Z","lastTransitionTime":"2025-12-09T14:58:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.240000 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" event={"ID":"b929ab12-7d60-4e9f-a1a7-3e2d8b086d36","Type":"ContainerStarted","Data":"04e20c1851a9313f071ddc6ae1191e467f3bea284ea6e0defd914dc5c926b7a9"} Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.240274 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" event={"ID":"b929ab12-7d60-4e9f-a1a7-3e2d8b086d36","Type":"ContainerStarted","Data":"0b220eef2e4180fd65d4a5381999b91753bb7322a9ab7101edf783cb800715fa"} Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.241913 4555 generic.go:334] "Generic (PLEG): container finished" podID="c87d128c-39da-4880-9efc-7d2e6319a854" containerID="c9ce6bcdccee7782ed6a2472052064ea5f0f9a5f25a8645c692165a2a4bf16dd" exitCode=0 Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.241969 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" event={"ID":"c87d128c-39da-4880-9efc-7d2e6319a854","Type":"ContainerDied","Data":"c9ce6bcdccee7782ed6a2472052064ea5f0f9a5f25a8645c692165a2a4bf16dd"} Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.241987 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" event={"ID":"c87d128c-39da-4880-9efc-7d2e6319a854","Type":"ContainerStarted","Data":"a1ccd13de6fb64e2a009e568b924fd28d4b6edc43b474b7b6a7461ff1b7a1970"} Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.243519 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-t6kl9" event={"ID":"c97d4dc5-ce47-4a7b-9511-7168db0ee4b4","Type":"ContainerStarted","Data":"fd97a458fe790bf9d7a8587a77d464c8bc4f97919bf4646660d92bd016be2707"} Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.245111 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gjjcm" event={"ID":"3e34542c-29e8-43d3-add3-42aa080e0c8a","Type":"ContainerStarted","Data":"b138ef0764baeeae025ff996cd643edabbc1de8f567e73f41fa748f481929656"} Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.246820 4555 generic.go:334] "Generic (PLEG): container finished" podID="dd8f66c1-da34-466f-ba53-4848c96b1287" containerID="2028082f4c35a431ec183abac3ace39ae9d9d292a8b3b879cf35cf03dc606d21" exitCode=0 Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.246849 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tpsq2" event={"ID":"dd8f66c1-da34-466f-ba53-4848c96b1287","Type":"ContainerDied","Data":"2028082f4c35a431ec183abac3ace39ae9d9d292a8b3b879cf35cf03dc606d21"} Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.246865 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tpsq2" event={"ID":"dd8f66c1-da34-466f-ba53-4848c96b1287","Type":"ContainerStarted","Data":"84c57bb853057f0ec77e0c8b384a96c7914a872577372ad3314fd2e36bbb77d5"} Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.249848 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:34Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.258812 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd5a66337668da0b71f8033ec52d3d1ed9ac69f468d6e48c1e863204ec6a6349\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2fd50693723f3d30a7f5fb890ed870f55b357017b85cdf7206ffc97d0d3d9b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:34Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.267640 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b929ab12-7d60-4e9f-a1a7-3e2d8b086d36\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04e20c1851a9313f071ddc6ae1191e467f3bea284ea6e0defd914dc5c926b7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff6nf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b220eef2e4180fd65d4a5381999b91753bb7322a9ab7101edf783cb800715fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff6nf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x6p6q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:34Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.269998 4555 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.270027 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.270037 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.270051 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.270062 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:34Z","lastTransitionTime":"2025-12-09T14:58:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.278999 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tpsq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd8f66c1-da34-466f-ba53-4848c96b1287\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tpsq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:34Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.285985 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t6kl9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c97d4dc5-ce47-4a7b-9511-7168db0ee4b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47xhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t6kl9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:34Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.295820 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"031bef68-de1b-47ae-92d9-8f3ab57baecb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd53bcc6affff59de2add405be1496710b4488d3794d358533ddaeecb1c50c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cd3af11c101edd63da0124a2190c0ee60408cf12e241de0b5743aff5732738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://787954aa7f87026339e7b167ba10c60c81f91c4d0f234fc508684f929edda0ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c591bf964c364ed5dfb0b30ae43f7fd456c549f701b0f1b8f7ab9ca7b8648f1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7803e19de1c3c096dfb29cd40b66a640cf8fe695b24eaf0ecfef38e88feefc27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:34Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.312244 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23358419-437c-41bf-bf1f-0da572d7ae77\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5ec008c2fa2ff6b0bffe7a32708ff1d0bfcfc0b6edbdd084977b99a2051bdc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://376e853c04e3ccafef162872380c7bf1dc0301c04c70c5428b965ca18989088f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6099f0072464b66ae672e70c295fe21a13657b81cc71dcecc55314cd91816c75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b8f1c6dd89c91a8ff2ca8baf361ee78811eeb8e10f2c4bb0504644cdd1302d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:34Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.323431 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:34Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.340701 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cb89a32-a132-4a5c-88ca-e561d2462abb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8452332d27457e371020c42533ef78285e33851568eed69be71bc9d10e663b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32422f3e8abddcc0d533cce0b4fc55c65309e00a02d4757d415d6f63f9b2e47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9db356805ba4080379f39ddf432276ac77417504341e77c1cc2d704756af1f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550d7f8a2e6cf94de70026192f2660d59f31fa1675a7c7e432767db5bf58c1b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://630ad7ccda8e51ae4d116b5821e71ebcc3fb68b4614de58470610807ad351ed6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\
\\":{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:34Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.357714 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d81323acabedcf739459ccc35af7dcbf68cd1159c9304c0ef841d774287425cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:34Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.369673 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:34Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.372776 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.372816 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.372826 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.372840 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.372850 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:34Z","lastTransitionTime":"2025-12-09T14:58:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.379223 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://222e3c51a925bbfd932c3620f13ad919006e2dad2c9063c60559ece4dd239f3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:34Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.388766 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gjjcm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e34542c-29e8-43d3-add3-42aa080e0c8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqc9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gjjcm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:34Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.403455 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c87d128c-39da-4880-9efc-7d2e6319a854\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p4bl4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:34Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.415453 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tpsq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd8f66c1-da34-466f-ba53-4848c96b1287\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2028082f4c35a431ec183abac3ace39ae9d9d292a8b3b879cf35cf03dc606d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2028082f4c35a431ec183abac3ace39ae9d9d292a8b3b879cf35cf03dc606d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tpsq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:34Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:34 crc 
kubenswrapper[4555]: I1209 14:58:34.426936 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:34Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.435782 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd5a66337668da0b71f8033ec52d3d1ed9ac69f468d6e48c1e863204ec6a6349\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2fd50693723f3d30a7f5fb890ed870f55b357017b85cdf7206ffc97d0d3d9b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:34Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.443130 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b929ab12-7d60-4e9f-a1a7-3e2d8b086d36\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04e20c1851a9313f071ddc6ae1191e467f3bea284ea6e0defd914dc5c926b7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff6nf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b220eef2e4180fd65d4a5381999b91753bb7322a9ab7101edf783cb800715fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff6nf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x6p6q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:34Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.449914 4555 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-dns/node-resolver-t6kl9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c97d4dc5-ce47-4a7b-9511-7168db0ee4b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd97a458fe790bf9d7a8587a77d464c8bc4f97919bf4646660d92bd016be2707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47xhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t6kl9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:34Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.462814 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"031bef68-de1b-47ae-92d9-8f3ab57baecb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd53bcc6affff59de2add405be1496710b4488d3794d358533ddaeecb1c50c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cd3af11c101edd63da0124a2190c0ee60408cf12e241de0b5743aff5732738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://787954aa7f87026339e7b167ba10c60c81f91c4d0f234fc508684f929edda0ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c591bf964c364ed5dfb0b30ae43f7fd456c549f701b0f1b8f7ab9ca7b8648f1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7803e19de1c3c096dfb29cd40b66a640cf8fe695b24eaf0ecfef38e88feefc27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:34Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.475013 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23358419-437c-41bf-bf1f-0da572d7ae77\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5ec008c2fa2ff6b0bffe7a32708ff1d0bfcfc0b6edbdd084977b99a2051bdc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://376e853c04e3ccafef162872380c7bf1dc0301c04c70c5428b965ca18989088f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6099f0072464b66ae672e70c295fe21a13657b81cc71dcecc55314cd91816c75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b8f1c6dd89c91a8ff2ca8baf361ee78811eeb8e10f2c4bb0504644cdd1302d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:34Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.475764 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.475808 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.475820 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.475840 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.475851 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:34Z","lastTransitionTime":"2025-12-09T14:58:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.484963 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:34Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.494837 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://222e3c51a925bbfd932c3620f13ad919006e2dad2c9063c60559ece4dd239f3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:34Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.504727 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gjjcm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e34542c-29e8-43d3-add3-42aa080e0c8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b138ef0764baeeae025ff996cd643edabbc1de8f567e73f41fa748f481929656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqc9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gjjcm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:34Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.518585 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c87d128c-39da-4880-9efc-7d2e6319a854\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\"
,\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ce6bcdccee7782ed6a2472052064ea5f0f9a5f25a8645c692165a2a4bf16dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9ce6bcdccee7782ed6a2472052064ea5f0f9a5f25a8645c692165a2a4bf16dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p4bl4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:34Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.532256 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cb89a32-a132-4a5c-88ca-e561d2462abb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8452332d27457e371020c42533ef78285e33851568eed69be71bc9d10e663b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32422f3e8abddcc0d533cce0b4fc55c65309e00a02d4757d415d6f63f9b2e47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9db356805ba4080379f39ddf432276ac77417504341e77c1cc2d704756af1f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550d7f8a2e6cf94de70026192f2660d59f31fa1
675a7c7e432767db5bf58c1b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://630ad7ccda8e51ae4d116b5821e71ebcc3fb68b4614de58470610807ad351ed6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:34Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.544757 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d81323acabedcf739459ccc35af7dcbf68cd1159c9304c0ef841d774287425cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:34Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.555138 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:34Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.578384 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.578432 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.578444 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.578473 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.578483 4555 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:34Z","lastTransitionTime":"2025-12-09T14:58:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.679985 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.680031 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.680046 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.680065 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.680075 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:34Z","lastTransitionTime":"2025-12-09T14:58:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.782602 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.782639 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.782648 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.782662 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.782671 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:34Z","lastTransitionTime":"2025-12-09T14:58:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.885590 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.885638 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.886013 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.886060 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.886075 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:34Z","lastTransitionTime":"2025-12-09T14:58:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.908146 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 14:58:34 crc kubenswrapper[4555]: E1209 14:58:34.908362 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 14:58:42.908338546 +0000 UTC m=+33.045051265 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.908478 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.908531 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:58:34 crc kubenswrapper[4555]: E1209 14:58:34.908650 4555 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 09 14:58:34 crc kubenswrapper[4555]: E1209 14:58:34.908689 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-09 14:58:42.908681901 +0000 UTC m=+33.045394620 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 09 14:58:34 crc kubenswrapper[4555]: E1209 14:58:34.908652 4555 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 09 14:58:34 crc kubenswrapper[4555]: E1209 14:58:34.908827 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-09 14:58:42.908803791 +0000 UTC m=+33.045516509 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.988746 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.988777 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.988785 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.988802 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:34 crc kubenswrapper[4555]: I1209 14:58:34.988811 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:34Z","lastTransitionTime":"2025-12-09T14:58:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.009286 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.009352 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:58:35 crc kubenswrapper[4555]: E1209 14:58:35.009469 4555 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 09 14:58:35 crc kubenswrapper[4555]: E1209 14:58:35.009489 4555 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 09 14:58:35 crc kubenswrapper[4555]: E1209 14:58:35.009495 4555 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 09 14:58:35 crc kubenswrapper[4555]: E1209 14:58:35.009505 4555 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 09 14:58:35 crc kubenswrapper[4555]: E1209 14:58:35.009513 4555 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 14:58:35 crc kubenswrapper[4555]: E1209 14:58:35.009517 4555 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 14:58:35 crc kubenswrapper[4555]: E1209 14:58:35.009570 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-09 14:58:43.009554953 +0000 UTC m=+33.146267671 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 14:58:35 crc kubenswrapper[4555]: E1209 14:58:35.009586 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-09 14:58:43.009580912 +0000 UTC m=+33.146293630 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.090810 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.090846 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.090856 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.090872 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.090881 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:35Z","lastTransitionTime":"2025-12-09T14:58:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.160671 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.160698 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.160711 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:58:35 crc kubenswrapper[4555]: E1209 14:58:35.160769 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 14:58:35 crc kubenswrapper[4555]: E1209 14:58:35.160885 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 14:58:35 crc kubenswrapper[4555]: E1209 14:58:35.161106 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.192790 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.192832 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.192842 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.192858 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.192875 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:35Z","lastTransitionTime":"2025-12-09T14:58:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.251333 4555 generic.go:334] "Generic (PLEG): container finished" podID="dd8f66c1-da34-466f-ba53-4848c96b1287" containerID="4f6498db60b815cc152db11a9d809f415e6ed632f66cf7eaeecbacb230d59dbe" exitCode=0 Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.251396 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tpsq2" event={"ID":"dd8f66c1-da34-466f-ba53-4848c96b1287","Type":"ContainerDied","Data":"4f6498db60b815cc152db11a9d809f415e6ed632f66cf7eaeecbacb230d59dbe"} Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.254737 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" event={"ID":"c87d128c-39da-4880-9efc-7d2e6319a854","Type":"ContainerStarted","Data":"a05bba24ff7f1d956fa0d3cb5949de61a278cddb4674bdb723749a2b005d82d4"} Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.254779 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" event={"ID":"c87d128c-39da-4880-9efc-7d2e6319a854","Type":"ContainerStarted","Data":"ec3d3a350dce6c9b5ea637d00abdaf43796a5595105dc82133f3d84349c12c25"} Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.254793 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" event={"ID":"c87d128c-39da-4880-9efc-7d2e6319a854","Type":"ContainerStarted","Data":"cd6cd4a3aeb4f2f1ad5ad6b22f80134a393ccb4c5b7e22c33d97324c8f70800d"} Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.254803 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" event={"ID":"c87d128c-39da-4880-9efc-7d2e6319a854","Type":"ContainerStarted","Data":"4dee71204e7a71c311143a6ed6fbde69968bc877053d6a400d252eaa010a88ac"} Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.254810 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" event={"ID":"c87d128c-39da-4880-9efc-7d2e6319a854","Type":"ContainerStarted","Data":"932166905ddf0fc44b9b7356a8f5efe68e1d44307d8e486946c8285ae969b8e5"} Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.254818 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" event={"ID":"c87d128c-39da-4880-9efc-7d2e6319a854","Type":"ContainerStarted","Data":"64685d2778ef0cf9d316897042a9e9353603a0cc7a234baf821ec03b149aeb07"} Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.262690 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t6kl9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c97d4dc5-ce47-4a7b-9511-7168db0ee4b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd97a458fe790bf9d7a8587a77d464c8bc4f97919bf4646660d92bd016be2707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47xhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t6kl9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:35Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.274036 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"031bef68-de1b-47ae-92d9-8f3ab57baecb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd53bcc6affff59de2add405be1496710b4488d3794d358533ddaeecb1c50c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cd3af11c101edd63da0124a2190c0ee60408cf12e241de0b5743aff5732738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://787954aa7f87026339e7b167ba10c60c81f91c4d0f234fc508684f929edda0ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c591bf964c364ed5dfb0b30ae43f7fd456c549f701b0f1b8f7ab9ca7b8648f1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7803e19de1c3c096dfb29cd40b66a640cf8fe695b24eaf0ecfef38e88feefc27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:35Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.284226 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23358419-437c-41bf-bf1f-0da572d7ae77\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5ec008c2fa2ff6b0bffe7a32708ff1d0bfcfc0b6edbdd084977b99a2051bdc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://376e853c04e3ccafef162872380c7bf1dc0301c04c70c5428b965ca18989088f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6099f0072464b66ae672e70c295fe21a13657b81cc71dcecc55314cd91816c75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b8f1c6dd89c91a8ff2ca8baf361ee78811eeb8e10f2c4bb0504644cdd1302d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:35Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.294485 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.294516 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.294525 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.294538 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.294552 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:35Z","lastTransitionTime":"2025-12-09T14:58:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.295523 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:35Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.308588 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:35Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.317228 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://222e3c51a925bbfd932c3620f13ad919006e2dad2c9063c60559ece4dd239f3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:35Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.325753 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gjjcm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e34542c-29e8-43d3-add3-42aa080e0c8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b138ef0764baeeae025ff996cd643edabbc1de8f567e73f41fa748f481929656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqc9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gjjcm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:35Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.341683 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c87d128c-39da-4880-9efc-7d2e6319a854\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ce6bcdccee7782ed6a2472052064ea5f0f9a5f25a8645c692165a2a4bf16dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9ce6bcdccee7782ed6a2472052064ea5f0f9a5f25a8645c692165a2a4bf16dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p4bl4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:35Z 
is after 2025-08-24T17:21:41Z" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.356995 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-72msc"] Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.357525 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-72msc" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.357975 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cb89a32-a132-4a5c-88ca-e561d2462abb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8452332d27457e371020c42533ef78285e33851568eed69be71bc9d10e663b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32422f3e8abddcc0d533cce0b4fc55c65309e00a02d4757d415d6f63f9b2e47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9db356805ba4080379f39ddf432276ac77417504341e77c1cc2d704756af1f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550d7f8a2e6cf94de70026192f2660d59f31fa1675a7c7e432767db5bf58c1b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://630ad7ccda8e51ae4d116b5821e71ebcc3fb68b4614de58470610807ad351ed6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb6
8e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:35Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.359101 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.360023 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.360163 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.360286 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.368373 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d81323acabedcf739459ccc35af7dcbf68cd1159c9304c0ef841d774287425cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:35Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.379038 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b929ab12-7d60-4e9f-a1a7-3e2d8b086d36\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04e20c1851a9313f071ddc6ae1191e467f3bea284ea6e0defd914dc5c926b7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff6nf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b220eef2e4180fd65d4a5381999b91753bb7322a9ab7101edf783cb800715fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff6nf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x6p6q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:35Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.389687 4555 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tpsq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd8f66c1-da34-466f-ba53-4848c96b1287\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2028082f4c35a431ec183abac3ace39ae9d9d292a8b3b879cf35cf03dc606d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2028082f4c35a431ec183abac3ace39ae9d9d292a8b3b879cf35cf03dc606d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"c
ontainerID\\\":\\\"cri-o://4f6498db60b815cc152db11a9d809f415e6ed632f66cf7eaeecbacb230d59dbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f6498db60b815cc152db11a9d809f415e6ed632f66cf7eaeecbacb230d59dbe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tpsq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:35Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.396759 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.397007 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.397018 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.397036 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.397046 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:35Z","lastTransitionTime":"2025-12-09T14:58:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.398586 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:35Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.407504 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd5a66337668da0b71f8033ec52d3d1ed9ac69f468d6e48c1e863204ec6a6349\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2fd50693723f3d30a7f5fb890ed870f55b357017b85cdf7206ffc97d0d3d9b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:35Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.412433 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b4f13f82-a44f-461d-93b0-916a7ed0b925-serviceca\") pod \"node-ca-72msc\" (UID: \"b4f13f82-a44f-461d-93b0-916a7ed0b925\") " pod="openshift-image-registry/node-ca-72msc" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.412491 4555 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b4f13f82-a44f-461d-93b0-916a7ed0b925-host\") pod \"node-ca-72msc\" (UID: \"b4f13f82-a44f-461d-93b0-916a7ed0b925\") " pod="openshift-image-registry/node-ca-72msc" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.412510 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2spqd\" (UniqueName: \"kubernetes.io/projected/b4f13f82-a44f-461d-93b0-916a7ed0b925-kube-api-access-2spqd\") pod \"node-ca-72msc\" (UID: \"b4f13f82-a44f-461d-93b0-916a7ed0b925\") " pod="openshift-image-registry/node-ca-72msc" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.417056 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"031bef68-de1b-47ae-92d9-8f3ab57baecb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd53bcc6affff59de2add405be1496710b4488d3794d358533ddaeecb1c50c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cd3af11c101edd63da0124a2190c0ee60408cf12e241de0b5743aff5732738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://787954aa7f87026339e7b167ba10c60c81f91c4d0f234fc508684f929edda0ed\\\",\\\"image\\\":\\\"quay.io/c
rcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c591bf964c364ed5dfb0b30ae43f7fd456c549f701b0f1b8f7ab9ca7b8648f1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7803e19de1c3c096dfb29cd40b66a640cf8fe695b24eaf0ecfef38e88feefc27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:35Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.425367 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23358419-437c-41bf-bf1f-0da572d7ae77\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5ec008c2fa2ff6b0bffe7a32708ff1d0bfcfc0b6edbdd084977b99a2051bdc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://376e853c04e3ccafef162872380c7bf1dc0301c04c70c5428b965ca18989088f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6099f0072464b66ae672e70c295fe21a13657b81cc71dcecc55314cd91816c75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\
"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b8f1c6dd89c91a8ff2ca8baf361ee78811eeb8e10f2c4bb0504644cdd1302d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:35Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.434633 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:35Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.444839 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:35Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.452883 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://222e3c51a925bbfd932c3620f13ad919006e2dad2c9063c60559ece4dd239f3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:35Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.461418 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gjjcm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e34542c-29e8-43d3-add3-42aa080e0c8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b138ef0764baeeae025ff996cd643edabbc1de8f567e73f41fa748f481929656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqc9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gjjcm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:35Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.476640 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c87d128c-39da-4880-9efc-7d2e6319a854\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\"
,\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ce6bcdccee7782ed6a2472052064ea5f0f9a5f25a8645c692165a2a4bf16dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9ce6bcdccee7782ed6a2472052064ea5f0f9a5f25a8645c692165a2a4bf16dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p4bl4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:35Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.486703 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-72msc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4f13f82-a44f-461d-93b0-916a7ed0b925\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:35Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2spqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-72msc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:35Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.499632 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.499665 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.499675 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.499691 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.499701 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:35Z","lastTransitionTime":"2025-12-09T14:58:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.503549 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cb89a32-a132-4a5c-88ca-e561d2462abb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8452332d27457e371020c42533ef78285e33851568eed69be71bc9d10e663b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32422f3e8abddcc0d533cce0b4fc55c65309e00a02d4757d415d6f63f9b2e47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9db356805ba4080379f39ddf432276ac77417504341e77c1cc2d704756af1f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550d7f8a2e6cf94de70026192f2660d59f31fa1675a7c7e432767db5bf58c1b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://630ad7ccda8e51ae4d116b5821e71ebcc3fb68b4614de58470610807ad351ed6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:35Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.513833 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b4f13f82-a44f-461d-93b0-916a7ed0b925-serviceca\") pod \"node-ca-72msc\" (UID: \"b4f13f82-a44f-461d-93b0-916a7ed0b925\") " pod="openshift-image-registry/node-ca-72msc" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.513919 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b4f13f82-a44f-461d-93b0-916a7ed0b925-host\") pod \"node-ca-72msc\" (UID: \"b4f13f82-a44f-461d-93b0-916a7ed0b925\") " pod="openshift-image-registry/node-ca-72msc" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.513951 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2spqd\" (UniqueName: \"kubernetes.io/projected/b4f13f82-a44f-461d-93b0-916a7ed0b925-kube-api-access-2spqd\") pod \"node-ca-72msc\" (UID: \"b4f13f82-a44f-461d-93b0-916a7ed0b925\") " pod="openshift-image-registry/node-ca-72msc" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.514007 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b4f13f82-a44f-461d-93b0-916a7ed0b925-host\") pod \"node-ca-72msc\" (UID: \"b4f13f82-a44f-461d-93b0-916a7ed0b925\") " pod="openshift-image-registry/node-ca-72msc" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.514272 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d81323acabedcf739459ccc35af7dcbf68cd1159c9304c0ef841d774287425cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:35Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.514678 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b4f13f82-a44f-461d-93b0-916a7ed0b925-serviceca\") pod \"node-ca-72msc\" (UID: \"b4f13f82-a44f-461d-93b0-916a7ed0b925\") " pod="openshift-image-registry/node-ca-72msc" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.530148 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2spqd\" (UniqueName: \"kubernetes.io/projected/b4f13f82-a44f-461d-93b0-916a7ed0b925-kube-api-access-2spqd\") pod \"node-ca-72msc\" (UID: \"b4f13f82-a44f-461d-93b0-916a7ed0b925\") " pod="openshift-image-registry/node-ca-72msc" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.552921 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b929ab12-7d60-4e9f-a1a7-3e2d8b086d36\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04e20c1851a9313f071ddc6ae1191e467f3bea284ea6e0defd914dc5c926b7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff6nf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b220eef2e4180fd65d4a5381999b91753bb7322a9ab7101edf783cb800715fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff6nf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x6p6q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:35Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.595297 4555 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tpsq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd8f66c1-da34-466f-ba53-4848c96b1287\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2028082f4c35a431ec183abac3ace39ae9d9d292a8b3b879cf35cf03dc606d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2028082f4c35a431ec183abac3ace39ae9d9d292a8b3b879cf35cf03dc606d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"c
ontainerID\\\":\\\"cri-o://4f6498db60b815cc152db11a9d809f415e6ed632f66cf7eaeecbacb230d59dbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f6498db60b815cc152db11a9d809f415e6ed632f66cf7eaeecbacb230d59dbe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tpsq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:35Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.602316 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.602351 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.602362 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.602378 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.602387 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:35Z","lastTransitionTime":"2025-12-09T14:58:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.634175 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:35Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.673660 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd5a66337668da0b71f8033ec52d3d1ed9ac69f468d6e48c1e863204ec6a6349\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2fd50693723f3d30a7f5fb890ed870f55b357017b85cdf7206ffc97d0d3d9b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:35Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.687085 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-72msc" Dec 09 14:58:35 crc kubenswrapper[4555]: W1209 14:58:35.700700 4555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4f13f82_a44f_461d_93b0_916a7ed0b925.slice/crio-f01e7f37821396f18f12b30f2586f4216ec715fbf56c4c92517b97bb90be5191 WatchSource:0}: Error finding container f01e7f37821396f18f12b30f2586f4216ec715fbf56c4c92517b97bb90be5191: Status 404 returned error can't find the container with id f01e7f37821396f18f12b30f2586f4216ec715fbf56c4c92517b97bb90be5191 Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.704104 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.704133 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.704142 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.704155 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.704164 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:35Z","lastTransitionTime":"2025-12-09T14:58:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.712356 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t6kl9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c97d4dc5-ce47-4a7b-9511-7168db0ee4b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd97a458fe790bf9d7a8587a77d464c8bc4f97919bf4646660d92bd016be2707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47xhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t6kl9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:35Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.806059 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.806088 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.806099 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.806112 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.806123 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:35Z","lastTransitionTime":"2025-12-09T14:58:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.908263 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.908336 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.908350 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.908364 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:35 crc kubenswrapper[4555]: I1209 14:58:35.908373 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:35Z","lastTransitionTime":"2025-12-09T14:58:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.010212 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.010251 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.010260 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.010276 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.010285 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:36Z","lastTransitionTime":"2025-12-09T14:58:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.113266 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.113607 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.113617 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.113631 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.113639 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:36Z","lastTransitionTime":"2025-12-09T14:58:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.215626 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.215678 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.215688 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.215706 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.215717 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:36Z","lastTransitionTime":"2025-12-09T14:58:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.258258 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-72msc" event={"ID":"b4f13f82-a44f-461d-93b0-916a7ed0b925","Type":"ContainerStarted","Data":"06c787dc46c305e7388f67f85601eb3e724c501d4da0504d4f8fd569f533c634"} Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.258333 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-72msc" event={"ID":"b4f13f82-a44f-461d-93b0-916a7ed0b925","Type":"ContainerStarted","Data":"f01e7f37821396f18f12b30f2586f4216ec715fbf56c4c92517b97bb90be5191"} Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.260417 4555 generic.go:334] "Generic (PLEG): container finished" podID="dd8f66c1-da34-466f-ba53-4848c96b1287" containerID="dd308a4726360ffaa40fd9895d4684062d998a9dec9c76222a4be5140ece0ced" exitCode=0 Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.260451 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tpsq2" event={"ID":"dd8f66c1-da34-466f-ba53-4848c96b1287","Type":"ContainerDied","Data":"dd308a4726360ffaa40fd9895d4684062d998a9dec9c76222a4be5140ece0ced"} Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.267106 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t6kl9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c97d4dc5-ce47-4a7b-9511-7168db0ee4b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd97a458fe790bf9d7a8587a77d464c8bc4f97919bf4646660d92bd016be2707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47xhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod 
\"openshift-dns\"/\"node-resolver-t6kl9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:36Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.276119 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"031bef68-de1b-47ae-92d9-8f3ab57baecb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd53bcc6affff59de2add405be1496710b4488d3794d358533ddaeecb1c50c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cd3af11c101edd63da0124a2190c0ee60408cf12e241de0b5743aff5732738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://787954aa7f87026339e7b167ba10c60c81f91c4d0f234fc508684f929edda0ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube
-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c591bf964c364ed5dfb0b30ae43f7fd456c549f701b0f1b8f7ab9ca7b8648f1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7803e19de1c3c096dfb29cd40b66a640cf8fe695b24eaf0ecfef38e88feefc27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:36Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.285542 4555 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23358419-437c-41bf-bf1f-0da572d7ae77\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5ec008c2fa2ff6b0bffe7a32708ff1d0bfcfc0b6edbdd084977b99a2051bdc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://376e853c04e3ccafef162872380c7bf1dc0301c04c70c5428b965ca18989088f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6099f0072464b66ae672e70c295fe21a13657b81cc71dcecc55314cd91816c75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b8f1c6dd89c91a8ff2ca8baf361ee78811eeb8e10f2c4bb05
04644cdd1302d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:36Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.293801 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:36Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.303144 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://222e3c51a925bbfd932c3620f13ad919006e2dad2c9063c60559ece4dd239f3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:36Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.312138 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gjjcm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e34542c-29e8-43d3-add3-42aa080e0c8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b138ef0764baeeae025ff996cd643edabbc1de8f567e73f41fa748f481929656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqc9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gjjcm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:36Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.317393 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.317418 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.317428 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.317456 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.317466 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:36Z","lastTransitionTime":"2025-12-09T14:58:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.325680 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c87d128c-39da-4880-9efc-7d2e6319a854\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ce6bcdccee7782ed6a2472052064ea5f0f9a5f25a8645c692165a2a4bf16dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9ce6bcdccee7782ed6a2472052064ea5f0f9a5f25a8645c692165a2a4bf16dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p4bl4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:36Z 
is after 2025-08-24T17:21:41Z" Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.333084 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-72msc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4f13f82-a44f-461d-93b0-916a7ed0b925\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06c787dc46c305e7388f67f85601eb3e724c501d4da0504d4f8fd569f533c634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2spqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-72msc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:36Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.350702 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cb89a32-a132-4a5c-88ca-e561d2462abb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8452332d27457e371020c42533ef78285e33851568eed69be71bc9d10e663b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32422f3e8abddcc0d533cce0b4fc55c65309e00a02d4757d415d6f63f9b2e47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9db356805ba4080379f39ddf432276ac77417504341e77c1cc2d704756af1f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550d7f8a2e6cf94de70026192f2660d59f31fa1
675a7c7e432767db5bf58c1b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://630ad7ccda8e51ae4d116b5821e71ebcc3fb68b4614de58470610807ad351ed6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:36Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.360934 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d81323acabedcf739459ccc35af7dcbf68cd1159c9304c0ef841d774287425cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:36Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.368796 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:36Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.379079 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tpsq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd8f66c1-da34-466f-ba53-4848c96b1287\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin 
routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2028082f4c35a431ec183abac3ace39ae9d9d292a8b3b879cf35cf03dc606d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2028082f4c35a431ec183abac3ace39ae9d9d292a8b3b879cf35cf03dc606d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f6498db60b815cc152db11a9d809f415e6ed632f66cf7eaeecbacb230d59dbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f6498db60b815cc152db11a9d809f415e6ed632f66cf7eaeecbacb230d59dbe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T
14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\
\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tpsq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:36Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.387964 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:36Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.396785 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd5a66337668da0b71f8033ec52d3d1ed9ac69f468d6e48c1e863204ec6a6349\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2fd50693723f3d30a7f5fb890ed870f55b357017b85cdf7206ffc97d0d3d9b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:36Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.404131 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b929ab12-7d60-4e9f-a1a7-3e2d8b086d36\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04e20c1851a9313f071ddc6ae1191e467f3bea284ea6e0defd914dc5c926b7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff6nf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b220eef2e4180fd65d4a5381999b91753bb7322a9ab7101edf783cb800715fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernete
s.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff6nf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x6p6q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:36Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.416026 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c87d128c-39da-4880-9efc-7d2e6319a854\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ce6bcdccee7782ed6a2472052064ea5f0f9a5f25a8645c692165a2a4bf16dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9ce6bcdccee7782ed6a2472052064ea5f0f9a5f25a8645c692165a2a4bf16dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p4bl4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:36Z 
is after 2025-08-24T17:21:41Z" Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.419096 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.419123 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.419132 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.419146 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.419155 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:36Z","lastTransitionTime":"2025-12-09T14:58:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.422627 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-72msc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4f13f82-a44f-461d-93b0-916a7ed0b925\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06c787dc46c305e7388f67f85601eb3e724c501d4da0504d4f8fd569f533c634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2spqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-
09T14:58:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-72msc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:36Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.437610 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cb89a32-a132-4a5c-88ca-e561d2462abb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8452332d27457e371020c42533ef78285e33851568eed69be71bc9d10e663b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32422f3e8abddcc0d533cce0b4fc55c65309e00a02d4757d415d6f63f9b2e47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9db356805ba4080379f39ddf432276ac77417504341e77c1cc2d704756af1f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550d7f8a2e6cf94de70026192f2660d59f31fa1675a7c7e432767db5bf58c1b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://630ad7ccda8e51ae4d116b5821e71ebcc3fb68b4614de58470610807ad351ed6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c68
77441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:36Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.473922 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d81323acabedcf739459ccc35af7dcbf68cd1159c9304c0ef841d774287425cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:36Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.512623 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:36Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.521357 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.521387 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.521398 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.521414 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.521425 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:36Z","lastTransitionTime":"2025-12-09T14:58:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.552799 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://222e3c51a925bbfd932c3620f13ad919006e2dad2c9063c60559ece4dd239f3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:36Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.593256 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gjjcm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e34542c-29e8-43d3-add3-42aa080e0c8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b138ef0764baeeae025ff996cd643edabbc1de8f567e73f41fa748f481929656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqc9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gjjcm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:36Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.623771 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.623796 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.623807 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.623821 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.623831 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:36Z","lastTransitionTime":"2025-12-09T14:58:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.645577 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:36Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.678294 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd5a66337668da0b71f8033ec52d3d1ed9ac69f468d6e48c1e863204ec6a6349\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2fd50693723f3d30a7f5fb890ed870f55b357017b85cdf7206ffc97d0d3d9b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:36Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.713579 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b929ab12-7d60-4e9f-a1a7-3e2d8b086d36\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04e20c1851a9313f071ddc6ae1191e467f3bea284ea6e0defd914dc5c926b7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff6nf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b220eef2e4180fd65d4a5381999b91753bb7322a9ab7101edf783cb800715fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernete
s.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff6nf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x6p6q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:36Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.726390 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.726423 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.726442 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.726470 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.726482 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:36Z","lastTransitionTime":"2025-12-09T14:58:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.755150 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tpsq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd8f66c1-da34-466f-ba53-4848c96b1287\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2028082f4c35a431ec183abac3ace39ae9d9d292a8b3b879cf35cf03dc606d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2028082f4c35a431ec183abac3ace39ae9d9d292a8b3b879cf35cf03dc606d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f6498db60b815cc152db11a9d809f415e6ed632f66cf7eaeecbacb230d59dbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f6498db60b815cc152db11a9d809f415e6ed632f66cf7eaeecbacb230d59dbe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd308a4726360ffaa40fd9895d4684062d998a9dec9c76222a4be5140ece0ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd308a4726360ffaa40fd9895d4684062d998a9dec9c76222a4be5140ece0ced\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},
{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tpsq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:36Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.791806 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t6kl9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c97d4dc5-ce47-4a7b-9511-7168db0ee4b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd97a458fe790bf9d7a8587a77d464c8bc4f97919bf4646660d92bd016be2707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47xhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t6kl9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:36Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.828752 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.828790 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.828799 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.828815 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.828824 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:36Z","lastTransitionTime":"2025-12-09T14:58:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.837158 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"031bef68-de1b-47ae-92d9-8f3ab57baecb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd53bcc6affff59de2add405be1496710b4488d3794d358533ddaeecb1c50c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cd3af11c101edd63da0124a2190c0ee60408cf12e241de0b5743aff5732738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://787954aa7f87026339e7b167ba10c60c81f91c4d0f234fc508684f929edda0ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name
\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c591bf964c364ed5dfb0b30ae43f7fd456c549f701b0f1b8f7ab9ca7b8648f1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7803e19de1c3c096dfb29cd40b66a640cf8fe695b24eaf0ecfef38e88feefc27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:36Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.874174 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23358419-437c-41bf-bf1f-0da572d7ae77\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5ec008c2fa2ff6b0bffe7a32708ff1d0bfcfc0b6edbdd084977b99a2051bdc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://376e853c04e3ccafef162872380c7bf1dc0301c04c70c5428b965ca18989088f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6099f0072464b66ae672e70c295fe21a13657b81cc71dcecc55314cd91816c75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b8f1c6dd89c91a8ff2ca8baf361ee78811eeb8e10f2c4bb0504644cdd1302d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:36Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.913064 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:36Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.930808 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.930845 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.930854 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.930870 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:36 crc kubenswrapper[4555]: I1209 14:58:36.930880 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:36Z","lastTransitionTime":"2025-12-09T14:58:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:37 crc kubenswrapper[4555]: I1209 14:58:37.033183 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:37 crc kubenswrapper[4555]: I1209 14:58:37.033220 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:37 crc kubenswrapper[4555]: I1209 14:58:37.033230 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:37 crc kubenswrapper[4555]: I1209 14:58:37.033247 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:37 crc kubenswrapper[4555]: I1209 14:58:37.033257 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:37Z","lastTransitionTime":"2025-12-09T14:58:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:37 crc kubenswrapper[4555]: I1209 14:58:37.135595 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:37 crc kubenswrapper[4555]: I1209 14:58:37.135628 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:37 crc kubenswrapper[4555]: I1209 14:58:37.135639 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:37 crc kubenswrapper[4555]: I1209 14:58:37.135652 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:37 crc kubenswrapper[4555]: I1209 14:58:37.135661 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:37Z","lastTransitionTime":"2025-12-09T14:58:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:37 crc kubenswrapper[4555]: I1209 14:58:37.160175 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:58:37 crc kubenswrapper[4555]: I1209 14:58:37.160185 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:58:37 crc kubenswrapper[4555]: I1209 14:58:37.160185 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:58:37 crc kubenswrapper[4555]: E1209 14:58:37.160400 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 14:58:37 crc kubenswrapper[4555]: E1209 14:58:37.160490 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 14:58:37 crc kubenswrapper[4555]: E1209 14:58:37.160599 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 14:58:37 crc kubenswrapper[4555]: I1209 14:58:37.237823 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:37 crc kubenswrapper[4555]: I1209 14:58:37.237851 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:37 crc kubenswrapper[4555]: I1209 14:58:37.237859 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:37 crc kubenswrapper[4555]: I1209 14:58:37.237873 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:37 crc kubenswrapper[4555]: I1209 14:58:37.237882 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:37Z","lastTransitionTime":"2025-12-09T14:58:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:37 crc kubenswrapper[4555]: I1209 14:58:37.265338 4555 generic.go:334] "Generic (PLEG): container finished" podID="dd8f66c1-da34-466f-ba53-4848c96b1287" containerID="e047250a4bced8e37ee26d9ba3065a61294fa9cc6d9491b6b96e35758b431edc" exitCode=0 Dec 09 14:58:37 crc kubenswrapper[4555]: I1209 14:58:37.265405 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tpsq2" event={"ID":"dd8f66c1-da34-466f-ba53-4848c96b1287","Type":"ContainerDied","Data":"e047250a4bced8e37ee26d9ba3065a61294fa9cc6d9491b6b96e35758b431edc"} Dec 09 14:58:37 crc kubenswrapper[4555]: I1209 14:58:37.269752 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" event={"ID":"c87d128c-39da-4880-9efc-7d2e6319a854","Type":"ContainerStarted","Data":"d9dc3533604ee31d652a75a8c65146c35b320c41f2d4fe85c3c03cadb99a9d46"} Dec 09 14:58:37 crc kubenswrapper[4555]: I1209 14:58:37.275649 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t6kl9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c97d4dc5-ce47-4a7b-9511-7168db0ee4b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd97a458fe790bf9d7a8587a77d464c8bc4f97919bf4646660d92bd016be2707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47xhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t6kl9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:37Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:37 crc kubenswrapper[4555]: I1209 14:58:37.286268 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"031bef68-de1b-47ae-92d9-8f3ab57baecb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd53bcc6affff59de2add405be1496710b4488d3794d358533ddaeecb1c50c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cd3af11c101edd63da0124a2190c0ee60408cf12e241de0b5743aff5732738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://787954aa7f87026339e7b167ba10c60c81f91c4d0f234fc508684f929edda0ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c591bf964c364ed5dfb0b30ae43f7fd456c549f701b0f1b8f7ab9ca7b8648f1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7803e19de1c3c096dfb29cd40b66a640cf8fe695b24eaf0ecfef38e88feefc27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:37Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:37 crc kubenswrapper[4555]: I1209 14:58:37.296359 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23358419-437c-41bf-bf1f-0da572d7ae77\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5ec008c2fa2ff6b0bffe7a32708ff1d0bfcfc0b6edbdd084977b99a2051bdc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://376e853c04e3ccafef162872380c7bf1dc0301c04c70c5428b965ca18989088f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6099f0072464b66ae672e70c295fe21a13657b81cc71dcecc55314cd91816c75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b8f1c6dd89c91a8ff2ca8baf361ee78811eeb8e10f2c4bb0504644cdd1302d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:37Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:37 crc kubenswrapper[4555]: I1209 14:58:37.305353 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:37Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:37 crc kubenswrapper[4555]: I1209 14:58:37.316035 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:37Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:37 crc kubenswrapper[4555]: I1209 14:58:37.329052 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://222e3c51a925bbfd932c3620f13ad919006e2dad2c9063c60559ece4dd239f3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:37Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:37 crc kubenswrapper[4555]: I1209 14:58:37.338997 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gjjcm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e34542c-29e8-43d3-add3-42aa080e0c8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b138ef0764baeeae025ff996cd643edabbc1de8f567e73f41fa748f481929656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqc9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gjjcm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:37Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:37 crc kubenswrapper[4555]: I1209 14:58:37.340028 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:37 crc kubenswrapper[4555]: I1209 14:58:37.340099 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:37 crc kubenswrapper[4555]: I1209 14:58:37.340109 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:37 crc kubenswrapper[4555]: I1209 14:58:37.340129 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:37 crc kubenswrapper[4555]: I1209 14:58:37.340140 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:37Z","lastTransitionTime":"2025-12-09T14:58:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:37 crc kubenswrapper[4555]: I1209 14:58:37.353298 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c87d128c-39da-4880-9efc-7d2e6319a854\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ce6bcdccee7782ed6a2472052064ea5f0f9a5f25a8645c692165a2a4bf16dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9ce6bcdccee7782ed6a2472052064ea5f0f9a5f25a8645c692165a2a4bf16dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p4bl4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:37Z 
is after 2025-08-24T17:21:41Z" Dec 09 14:58:37 crc kubenswrapper[4555]: I1209 14:58:37.361807 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-72msc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4f13f82-a44f-461d-93b0-916a7ed0b925\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06c787dc46c305e7388f67f85601eb3e724c501d4da0504d4f8fd569f533c634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2spqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-72msc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:37Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:37 crc kubenswrapper[4555]: I1209 14:58:37.375070 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cb89a32-a132-4a5c-88ca-e561d2462abb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8452332d27457e371020c42533ef78285e33851568eed69be71bc9d10e663b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32422f3e8abddcc0d533cce0b4fc55c65309e00a02d4757d415d6f63f9b2e47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9db356805ba4080379f39ddf432276ac77417504341e77c1cc2d704756af1f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550d7f8a2e6cf94de70026192f2660d59f31fa1
675a7c7e432767db5bf58c1b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://630ad7ccda8e51ae4d116b5821e71ebcc3fb68b4614de58470610807ad351ed6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:37Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:37 crc kubenswrapper[4555]: I1209 14:58:37.385283 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d81323acabedcf739459ccc35af7dcbf68cd1159c9304c0ef841d774287425cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:37Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:37 crc kubenswrapper[4555]: I1209 14:58:37.395482 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b929ab12-7d60-4e9f-a1a7-3e2d8b086d36\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04e20c1851a9313f071ddc6ae1191e467f3bea284ea6e0defd914dc5c926b7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff6nf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b220eef2e4180fd65d4a5381999b91753bb7322a9ab7101edf783cb800715fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff6nf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12
-09T14:58:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x6p6q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:37Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:37 crc kubenswrapper[4555]: I1209 14:58:37.435727 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tpsq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd8f66c1-da34-466f-ba53-4848c96b1287\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2028082f4c35a431ec183abac3ace39ae9d9d292a8b3b879cf35cf03dc606d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2028082f4c35a431ec183abac3ace39ae9d9d292a8b3b879cf35cf03dc606d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f6498db60b815cc152db11a9d809f415e6ed632f66cf7eaeecbacb230d59dbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f6498db60b815cc152db11a9d809f415e6ed632f66cf7eaeecbacb230d59dbe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd308a4726360ffaa40fd9895d4684062d998a9dec9c76222a4be5140ece0ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd308a4726360ffaa40fd9895d4684062d998a9dec9c76222a4be5140ece0ced\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e047250a4bced8e37ee26d9ba3065a61294fa9cc6d9491b6b96e35758b431edc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e047250a4bced8e37ee26d9ba3065a61294fa9cc6d9491b6b96e35758b431edc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tpsq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:37Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:37 crc kubenswrapper[4555]: I1209 14:58:37.442243 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:37 crc kubenswrapper[4555]: I1209 14:58:37.442273 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:37 crc kubenswrapper[4555]: I1209 14:58:37.442282 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:37 crc kubenswrapper[4555]: I1209 14:58:37.442296 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:37 crc kubenswrapper[4555]: I1209 14:58:37.442324 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:37Z","lastTransitionTime":"2025-12-09T14:58:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:37 crc kubenswrapper[4555]: I1209 14:58:37.475272 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:37Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:37 crc kubenswrapper[4555]: I1209 14:58:37.515106 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd5a66337668da0b71f8033ec52d3d1ed9ac69f468d6e48c1e863204ec6a6349\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2fd50693723f3d30a7f5fb890ed870f55b357017b85cdf7206ffc97d0d3d9b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:37Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:37 crc kubenswrapper[4555]: I1209 14:58:37.545233 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:37 crc kubenswrapper[4555]: I1209 14:58:37.545265 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:37 crc kubenswrapper[4555]: I1209 14:58:37.545274 4555 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 09 14:58:37 crc kubenswrapper[4555]: I1209 14:58:37.545288 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:37 crc kubenswrapper[4555]: I1209 14:58:37.545297 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:37Z","lastTransitionTime":"2025-12-09T14:58:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:37 crc kubenswrapper[4555]: I1209 14:58:37.647574 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:37 crc kubenswrapper[4555]: I1209 14:58:37.647836 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:37 crc kubenswrapper[4555]: I1209 14:58:37.647844 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:37 crc kubenswrapper[4555]: I1209 14:58:37.647855 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:37 crc kubenswrapper[4555]: I1209 14:58:37.647864 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:37Z","lastTransitionTime":"2025-12-09T14:58:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:37 crc kubenswrapper[4555]: I1209 14:58:37.750536 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:37 crc kubenswrapper[4555]: I1209 14:58:37.750593 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:37 crc kubenswrapper[4555]: I1209 14:58:37.750607 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:37 crc kubenswrapper[4555]: I1209 14:58:37.750632 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:37 crc kubenswrapper[4555]: I1209 14:58:37.750645 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:37Z","lastTransitionTime":"2025-12-09T14:58:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:37 crc kubenswrapper[4555]: I1209 14:58:37.852840 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:37 crc kubenswrapper[4555]: I1209 14:58:37.852863 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:37 crc kubenswrapper[4555]: I1209 14:58:37.852872 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:37 crc kubenswrapper[4555]: I1209 14:58:37.852883 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:37 crc kubenswrapper[4555]: I1209 14:58:37.852890 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:37Z","lastTransitionTime":"2025-12-09T14:58:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:37 crc kubenswrapper[4555]: I1209 14:58:37.955171 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:37 crc kubenswrapper[4555]: I1209 14:58:37.955214 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:37 crc kubenswrapper[4555]: I1209 14:58:37.955223 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:37 crc kubenswrapper[4555]: I1209 14:58:37.955243 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:37 crc kubenswrapper[4555]: I1209 14:58:37.955256 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:37Z","lastTransitionTime":"2025-12-09T14:58:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:38 crc kubenswrapper[4555]: I1209 14:58:38.057017 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:38 crc kubenswrapper[4555]: I1209 14:58:38.057052 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:38 crc kubenswrapper[4555]: I1209 14:58:38.057060 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:38 crc kubenswrapper[4555]: I1209 14:58:38.057074 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:38 crc kubenswrapper[4555]: I1209 14:58:38.057084 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:38Z","lastTransitionTime":"2025-12-09T14:58:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:38 crc kubenswrapper[4555]: I1209 14:58:38.158991 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:38 crc kubenswrapper[4555]: I1209 14:58:38.159065 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:38 crc kubenswrapper[4555]: I1209 14:58:38.159079 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:38 crc kubenswrapper[4555]: I1209 14:58:38.159096 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:38 crc kubenswrapper[4555]: I1209 14:58:38.159105 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:38Z","lastTransitionTime":"2025-12-09T14:58:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:38 crc kubenswrapper[4555]: I1209 14:58:38.260619 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:38 crc kubenswrapper[4555]: I1209 14:58:38.260647 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:38 crc kubenswrapper[4555]: I1209 14:58:38.260655 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:38 crc kubenswrapper[4555]: I1209 14:58:38.260668 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:38 crc kubenswrapper[4555]: I1209 14:58:38.260677 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:38Z","lastTransitionTime":"2025-12-09T14:58:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:38 crc kubenswrapper[4555]: I1209 14:58:38.274633 4555 generic.go:334] "Generic (PLEG): container finished" podID="dd8f66c1-da34-466f-ba53-4848c96b1287" containerID="d24862fb9296afeb3ec86c0cd5e862021c7e4c8a0f139f9fd11dfd1faba2b392" exitCode=0 Dec 09 14:58:38 crc kubenswrapper[4555]: I1209 14:58:38.274672 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tpsq2" event={"ID":"dd8f66c1-da34-466f-ba53-4848c96b1287","Type":"ContainerDied","Data":"d24862fb9296afeb3ec86c0cd5e862021c7e4c8a0f139f9fd11dfd1faba2b392"} Dec 09 14:58:38 crc kubenswrapper[4555]: I1209 14:58:38.289791 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"031bef68-de1b-47ae-92d9-8f3ab57baecb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd53bcc6affff59de2add405be1496710b4488d3794d358533ddaeecb1c50c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cd3af11c101edd63da0124a2190c0ee60408cf12e241de0b5743aff5732738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://787954aa7f87026339e7b167ba10c60c81f91c4d0f234fc508684f929edda0ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc27
6e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c591bf964c364ed5dfb0b30ae43f7fd456c549f701b0f1b8f7ab9ca7b8648f1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7803e19de1c3c096dfb29cd40b66a640cf8fe695b24eaf0ecfef38e88feefc27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:38Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:38 crc kubenswrapper[4555]: I1209 14:58:38.299403 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23358419-437c-41bf-bf1f-0da572d7ae77\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5ec008c2fa2ff6b0bffe7a32708ff1d0bfcfc0b6edbdd084977b99a2051bdc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://376e853c04e3ccafef162872380c7bf1dc0301c04c70c5428b965ca18989088f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6099f0072464b66ae672e70c295fe21a13657b81cc71dcecc55314cd91816c75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b8f1c6dd89c91a8ff2ca8baf361ee78811eeb8e10f2c4bb0504644cdd1302d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:38Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:38 crc kubenswrapper[4555]: I1209 14:58:38.308826 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:38Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:38 crc kubenswrapper[4555]: I1209 14:58:38.319387 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:38Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:38 crc kubenswrapper[4555]: I1209 14:58:38.328007 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://222e3c51a925bbfd932c3620f13ad919006e2dad2c9063c60559ece4dd239f3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:38Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:38 crc kubenswrapper[4555]: I1209 14:58:38.338619 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gjjcm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e34542c-29e8-43d3-add3-42aa080e0c8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b138ef0764baeeae025ff996cd643edabbc1de8f567e73f41fa748f481929656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqc9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gjjcm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:38Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:38 crc kubenswrapper[4555]: I1209 14:58:38.352995 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c87d128c-39da-4880-9efc-7d2e6319a854\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\"
,\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ce6bcdccee7782ed6a2472052064ea5f0f9a5f25a8645c692165a2a4bf16dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9ce6bcdccee7782ed6a2472052064ea5f0f9a5f25a8645c692165a2a4bf16dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p4bl4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:38Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:38 crc kubenswrapper[4555]: I1209 14:58:38.362448 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:38 crc kubenswrapper[4555]: I1209 14:58:38.362485 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:38 crc kubenswrapper[4555]: I1209 14:58:38.362493 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:38 crc kubenswrapper[4555]: I1209 14:58:38.362507 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:38 crc kubenswrapper[4555]: I1209 14:58:38.362517 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:38Z","lastTransitionTime":"2025-12-09T14:58:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:38 crc kubenswrapper[4555]: I1209 14:58:38.365098 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-72msc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4f13f82-a44f-461d-93b0-916a7ed0b925\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06c787dc46c305e7388f67f85601eb3e724c501d4da0504d4f8fd569f533c634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2spqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-72msc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:38Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:38 crc kubenswrapper[4555]: I1209 14:58:38.382183 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cb89a32-a132-4a5c-88ca-e561d2462abb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8452332d27457e371020c42533ef78285e33851568eed69be71bc9d10e663b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32422f3e8abddcc0d533cce0b4fc55c65309e00a02d4757d415d6f63f9b2e47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9db356805ba4080379f39ddf432276ac77417504341e77c1cc2d704756af1f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550d7f8a2e6cf94de70026192f2660d59f31fa1
675a7c7e432767db5bf58c1b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://630ad7ccda8e51ae4d116b5821e71ebcc3fb68b4614de58470610807ad351ed6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:38Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:38 crc kubenswrapper[4555]: I1209 14:58:38.394134 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d81323acabedcf739459ccc35af7dcbf68cd1159c9304c0ef841d774287425cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:38Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:38 crc kubenswrapper[4555]: I1209 14:58:38.404027 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b929ab12-7d60-4e9f-a1a7-3e2d8b086d36\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04e20c1851a9313f071ddc6ae1191e467f3bea284ea6e0defd914dc5c926b7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff6nf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b220eef2e4180fd65d4a5381999b91753bb7322a9ab7101edf783cb800715fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff6nf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12
-09T14:58:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x6p6q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:38Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:38 crc kubenswrapper[4555]: I1209 14:58:38.415074 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tpsq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd8f66c1-da34-466f-ba53-4848c96b1287\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2028082f4c35a431ec183abac3ace39ae9d9d292a8b3b879cf35cf03dc606d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2028082f4c35a431ec183abac3ace39ae9d9d292a8b3b879cf35cf03dc606d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f6498db60b815cc152db11a9d809f415e6ed632f66cf7eaeecbacb230d59dbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f6498db60b815cc152db11a9d809f415e6ed632f66cf7eaeecbacb230d59dbe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd308a4726360ffaa40fd9895d4684062d998a9dec9c76222a4be5140ece0ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd308a4726360ffaa40fd9895d4684062d998a9dec9c76222a4be5140ece0ced\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e047250a4bced8e37ee26d9ba3065a61294fa9cc6d9491b6b96e35758b431edc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e047250a4bced8e37ee26d9ba3065a61294fa9cc6d9491b6b96e35758b431edc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d24862fb9296afeb3ec86c0cd5e862021c7e4c8a0f139f9fd11dfd1faba2b392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d24862fb9296afeb3ec86c0cd5e862021c7e4c8a0f139f9fd11dfd1faba2b392\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tpsq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:38Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:38 crc kubenswrapper[4555]: I1209 14:58:38.425864 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:38Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:38 crc kubenswrapper[4555]: I1209 14:58:38.436346 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd5a66337668da0b71f8033ec52d3d1ed9ac69f468d6e48c1e863204ec6a6349\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2fd50693723f3d30a7f5fb890ed870f55b357017b85cdf7206ffc97d0d3d9b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:38Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:38 crc kubenswrapper[4555]: I1209 14:58:38.444235 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t6kl9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c97d4dc5-ce47-4a7b-9511-7168db0ee4b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd97a458fe790bf9d7a8587a77d464c8bc4f97919bf4646660d92bd016be2707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47xhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t6kl9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:38Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:38 crc kubenswrapper[4555]: I1209 14:58:38.464709 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:38 crc kubenswrapper[4555]: I1209 
14:58:38.464748 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:38 crc kubenswrapper[4555]: I1209 14:58:38.464758 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:38 crc kubenswrapper[4555]: I1209 14:58:38.464775 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:38 crc kubenswrapper[4555]: I1209 14:58:38.464786 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:38Z","lastTransitionTime":"2025-12-09T14:58:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:38 crc kubenswrapper[4555]: I1209 14:58:38.566808 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:38 crc kubenswrapper[4555]: I1209 14:58:38.566847 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:38 crc kubenswrapper[4555]: I1209 14:58:38.566856 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:38 crc kubenswrapper[4555]: I1209 14:58:38.566871 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:38 crc kubenswrapper[4555]: I1209 14:58:38.566879 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:38Z","lastTransitionTime":"2025-12-09T14:58:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:38 crc kubenswrapper[4555]: I1209 14:58:38.668968 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:38 crc kubenswrapper[4555]: I1209 14:58:38.669009 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:38 crc kubenswrapper[4555]: I1209 14:58:38.669018 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:38 crc kubenswrapper[4555]: I1209 14:58:38.669035 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:38 crc kubenswrapper[4555]: I1209 14:58:38.669046 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:38Z","lastTransitionTime":"2025-12-09T14:58:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:38 crc kubenswrapper[4555]: I1209 14:58:38.771182 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:38 crc kubenswrapper[4555]: I1209 14:58:38.771219 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:38 crc kubenswrapper[4555]: I1209 14:58:38.771243 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:38 crc kubenswrapper[4555]: I1209 14:58:38.771256 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:38 crc kubenswrapper[4555]: I1209 14:58:38.771265 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:38Z","lastTransitionTime":"2025-12-09T14:58:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:38 crc kubenswrapper[4555]: I1209 14:58:38.873008 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:38 crc kubenswrapper[4555]: I1209 14:58:38.873047 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:38 crc kubenswrapper[4555]: I1209 14:58:38.873058 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:38 crc kubenswrapper[4555]: I1209 14:58:38.873074 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:38 crc kubenswrapper[4555]: I1209 14:58:38.873095 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:38Z","lastTransitionTime":"2025-12-09T14:58:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:38 crc kubenswrapper[4555]: I1209 14:58:38.975986 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:38 crc kubenswrapper[4555]: I1209 14:58:38.976202 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:38 crc kubenswrapper[4555]: I1209 14:58:38.976214 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:38 crc kubenswrapper[4555]: I1209 14:58:38.976232 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:38 crc kubenswrapper[4555]: I1209 14:58:38.976242 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:38Z","lastTransitionTime":"2025-12-09T14:58:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.078647 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.078677 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.078686 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.078700 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.078709 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:39Z","lastTransitionTime":"2025-12-09T14:58:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.160710 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.160763 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:58:39 crc kubenswrapper[4555]: E1209 14:58:39.160803 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 14:58:39 crc kubenswrapper[4555]: E1209 14:58:39.160853 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.160887 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:58:39 crc kubenswrapper[4555]: E1209 14:58:39.160933 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.181203 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.181498 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.181508 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.181526 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.181535 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:39Z","lastTransitionTime":"2025-12-09T14:58:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.280539 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" event={"ID":"c87d128c-39da-4880-9efc-7d2e6319a854","Type":"ContainerStarted","Data":"0e2ab2314ac7279678aa798c5a984672570725524350b32ad586d4ca8dc55197"} Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.280736 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.283236 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.283262 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.283269 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.283282 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.283291 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:39Z","lastTransitionTime":"2025-12-09T14:58:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.284106 4555 generic.go:334] "Generic (PLEG): container finished" podID="dd8f66c1-da34-466f-ba53-4848c96b1287" containerID="61ac07e538318096b1c9c2c8d0608f0b4d9b4367df0bb24575591b2f3f6c28c3" exitCode=0 Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.284131 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tpsq2" event={"ID":"dd8f66c1-da34-466f-ba53-4848c96b1287","Type":"ContainerDied","Data":"61ac07e538318096b1c9c2c8d0608f0b4d9b4367df0bb24575591b2f3f6c28c3"} Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.291422 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t6kl9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c97d4dc5-ce47-4a7b-9511-7168db0ee4b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd97a458fe790bf9d7a8587a77d464c8bc4f97919bf4646660d92bd016be2707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47xhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t6kl9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:39Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.301942 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.302779 4555 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"031bef68-de1b-47ae-92d9-8f3ab57baecb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd53bcc6affff59de2add405be1496710b4488d3794d358533ddaeecb1c50c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cd3af11c101edd63da0124a2190c0ee60408cf12e241de0b5743aff5732738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://787954aa7f87026339e7b167ba10c60c81f91c4d0f234fc508684f929edda0ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c591bf964c364ed5dfb0b30ae43f7fd4
56c549f701b0f1b8f7ab9ca7b8648f1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7803e19de1c3c096dfb29cd40b66a640cf8fe695b24eaf0ecfef38e88feefc27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:39Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.312555 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23358419-437c-41bf-bf1f-0da572d7ae77\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5ec008c2fa2ff6b0bffe7a32708ff1d0bfcfc0b6edbdd084977b99a2051bdc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://376e853c04e3ccafef162872380c7bf1dc0301c04c70c5428b965ca18989088f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6099f0072464b66ae672e70c295fe21a13657b81cc71dcecc55314cd91816c75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b8f1c6dd89c91a8ff2ca8baf361ee78811eeb8e10f2c4bb0504644cdd1302d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:39Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.322722 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:39Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.337825 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gjjcm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e34542c-29e8-43d3-add3-42aa080e0c8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b138ef0764baeeae025ff996cd643edabbc1de8f567e73f41fa748f481929656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqc9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gjjcm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:39Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.351384 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c87d128c-39da-4880-9efc-7d2e6319a854\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dee71204e7a71c311143a6ed6fbde69968bc877053d6a400d252eaa010a88ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6cd4a3aeb4f2f1ad5ad6b22f80134a393ccb4c5b7e22c33d97324c8f70800d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a05bba24ff7f1d956fa0d3cb5949de61a278cddb4674bdb723749a2b005d82d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec3d3a350dce6c9b5ea637d00abdaf43796a5595105dc82133f3d84349c12c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://932166905ddf0fc44b9b7356a8f5efe68e1d44307d8e486946c8285ae969b8e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64685d2778ef0cf9d316897042a9e9353603a0cc7a234baf821ec03b149aeb07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2ab2314ac7279678aa798c5a98467257072552
4350b32ad586d4ca8dc55197\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9dc3533604ee31d652a75a8c65146c35b320c41f2d4fe85c3c03cadb99a9d46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccou
nt\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ce6bcdccee7782ed6a2472052064ea5f0f9a5f25a8645c692165a2a4bf16dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9ce6bcdccee7782ed6a2472052064ea5f0f9a5f25a8645c692165a2a4bf16dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p4bl4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:39Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.359426 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-72msc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4f13f82-a44f-461d-93b0-916a7ed0b925\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06c787dc46c305e7388f67f85601eb3e724c501d4da0504d4f8fd569f533c634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2spqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-72msc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:39Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.376278 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cb89a32-a132-4a5c-88ca-e561d2462abb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8452332d27457e371020c42533ef78285e33851568eed69be71bc9d10e663b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32422f3e8abddcc0d533cce0b4fc55c65309e00a02d4757d415d6f63f9b2e47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9db356805ba4080379f39ddf432276ac77417504341e77c1cc2d704756af1f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550d7f8a2e6cf94de70026192f2660d59f31fa1
675a7c7e432767db5bf58c1b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://630ad7ccda8e51ae4d116b5821e71ebcc3fb68b4614de58470610807ad351ed6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:39Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.385118 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.385155 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.385169 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.385187 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.385197 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:39Z","lastTransitionTime":"2025-12-09T14:58:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.387607 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d81323acabedcf739459ccc35af7dcbf68cd1159c9304c0ef841d774287425cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:39Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.399078 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:39Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.409288 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://222e3c51a925bbfd932c3620f13ad919006e2dad2c9063c60559ece4dd239f3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:39Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.418793 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:39Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.429258 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd5a66337668da0b71f8033ec52d3d1ed9ac69f468d6e48c1e863204ec6a6349\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2fd50693723f3d30a7f5fb890ed870f55b357017b85cdf7206ffc97d0d3d9b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:39Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.437327 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b929ab12-7d60-4e9f-a1a7-3e2d8b086d36\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04e20c1851a9313f071ddc6ae1191e467f3bea284ea6e0defd914dc5c926b7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff6nf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b220eef2e4180fd65d4a5381999b91753bb7322a9ab7101edf783cb800715fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff6nf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x6p6q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:39Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.449074 4555 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tpsq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd8f66c1-da34-466f-ba53-4848c96b1287\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2028082f4c35a431ec183abac3ace39ae9d9d292a8b3b879cf35cf03dc606d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2028082f4c35a431ec183abac3ace39ae9d9d292a8b3b879cf35cf03dc606d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f6498db60b815cc152db11a9d809f4
15e6ed632f66cf7eaeecbacb230d59dbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f6498db60b815cc152db11a9d809f415e6ed632f66cf7eaeecbacb230d59dbe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd308a4726360ffaa40fd9895d4684062d998a9dec9c76222a4be5140ece0ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd308a4726360ffaa40fd9895d4684062d998a9dec9c76222a4be5140ece0ced\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e047250a4bced8e37ee26d9ba3065a61294fa9cc6d9491b6b96e35758b431edc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e047250a4bced8e37ee26d9ba3065a61294fa9cc6d9491b6b96e35758b431edc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\
\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d24862fb9296afeb3ec86c0cd5e862021c7e4c8a0f139f9fd11dfd1faba2b392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d24862fb9296afeb3ec86c0cd5e862021c7e4c8a0f139f9fd11dfd1faba2b392\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tpsq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:39Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.463592 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cb89a32-a132-4a5c-88ca-e561d2462abb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8452332d27457e371020c42533ef78285e33851568eed69be71bc9d10e663b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32422f3e8abddcc0d533cce0b4fc55c65309e00a02d4757d415d6f63f9b2e47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9db356805ba4080379f39ddf432276ac77417504341e77c1cc2d704756af1f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550d7f8a2e6cf94de70026192f2660d59f31fa1
675a7c7e432767db5bf58c1b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://630ad7ccda8e51ae4d116b5821e71ebcc3fb68b4614de58470610807ad351ed6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:39Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.473542 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d81323acabedcf739459ccc35af7dcbf68cd1159c9304c0ef841d774287425cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:39Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.482214 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:39Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.486713 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.486822 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.486882 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.486964 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.487025 4555 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:39Z","lastTransitionTime":"2025-12-09T14:58:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.492067 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://222e3c51a925bbfd932c3620f13ad919006e2dad2c9063c60559ece4dd239f3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:39Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.504154 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gjjcm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e34542c-29e8-43d3-add3-42aa080e0c8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b138ef0764baeeae025ff996cd643edabbc1de8f567e73f41fa748f481929656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqc9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gjjcm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:39Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.521557 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c87d128c-39da-4880-9efc-7d2e6319a854\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dee71204e7a71c311143a6ed6fbde69968bc877053d6a400d252eaa010a88ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6cd4a3aeb4f2f1ad5ad6b22f80134a393ccb4c5b7e22c33d97324c8f70800d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a05bba24ff7f1d956fa0d3cb5949de61a278cddb4674bdb723749a2b005d82d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec3d3a350dce6c9b5ea637d00abdaf43796a5595105dc82133f3d84349c12c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://932166905ddf0fc44b9b7356a8f5efe68e1d44307d8e486946c8285ae969b8e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cr
i-o://64685d2778ef0cf9d316897042a9e9353603a0cc7a234baf821ec03b149aeb07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2ab2314ac7279678aa798c5a984672570725524350b32ad586d4ca8dc55197\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"
mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9dc3533604ee31d652a75a8c65146c35b320c41f2d4fe85c3c03cadb99a9d46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ce6bcdccee7782ed6a2472052064ea5f0f9a5f25a8645c692165a2a4bf16dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9ce6bcdccee7782ed6a2472052064ea5f0f9a5f25a8645c692165a2a4bf16dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p4bl4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:39Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.529728 4555 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-72msc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4f13f82-a44f-461d-93b0-916a7ed0b925\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06c787dc46c305e7388f67f85601eb3e724c501d4da0504d4f8fd569f533c634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2spqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-72msc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:39Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.538027 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:39Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.547678 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd5a66337668da0b71f8033ec52d3d1ed9ac69f468d6e48c1e863204ec6a6349\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2fd50693723f3d30a7f5fb890ed870f55b357017b85cdf7206ffc97d0d3d9b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41
ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:39Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.558108 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b929ab12-7d60-4e9f-a1a7-3e2d8b086d36\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04e20c1851a9313f071ddc6ae1191e467f3bea284ea6e0defd914dc5c926b7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff6nf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b220eef2e4180fd65d4a5381999b91753bb7322a9ab7101edf783cb800715fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a
9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff6nf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x6p6q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:39Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.567613 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tpsq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd8f66c1-da34-466f-ba53-4848c96b1287\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2028082f4c35a431ec183abac3ace39ae9d9d292a8b3b879cf35cf03dc606d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2028082f4c35a431ec183abac3ace39ae9d9d292a8b3b879cf35cf03dc606d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f6498db60b815cc152db11a9d809f415e6ed632f66cf7eaeecbacb230d59dbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f6498db60b815cc152db11a9d809f415e6ed632f66cf7eaeecbacb230d59dbe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd308a4726360ffaa40fd9895d4684062d998a9dec9c76222a4be5140ece0ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd308a4726360ffaa40fd9895d4684062d998a9dec9c76222a4be5140ece0ced\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e047250a4bced8e37ee26d9ba3065a61294fa9cc6d9491b6b96e35758b431edc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e047250a4bced8e37ee26d9ba3065a61294fa9cc6d9491b6b96e35758b431edc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d24862fb9296afeb3ec86c0cd5e862021c7e4c8a0f139f9fd11dfd1faba2b392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d24862fb9296afeb3ec86c0cd5e862021c7e4c8a0f139f9fd11dfd1faba2b392\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ac07e538318096b1c9c2c8d0608f0b4d9b4367df0bb24575591b2f3f6c28c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ac07e538318096b1c9c2c8d0608f0b4d9b4367df0bb24575591b2f3f6c28c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tpsq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:39Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.574786 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t6kl9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c97d4dc5-ce47-4a7b-9511-7168db0ee4b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd97a458fe790bf9d7a8587a77d464c8bc4f97919bf4646660d92bd016be2707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47xhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t6kl9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:39Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.584173 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"031bef68-de1b-47ae-92d9-8f3ab57baecb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd53bcc6affff59de2add405be1496710b4488d3794d358533ddaeecb1c50c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cd3af11c101edd63da0124a2190c0ee60408cf12e241de0b5743aff5732738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://787954aa7f87026339e7b167ba10c60c81f91c4d0f234fc508684f929edda0ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c591bf964c364ed5dfb0b30ae43f7fd456c549f701b0f1b8f7ab9ca7b8648f1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7803e19de1c3c096dfb29cd40b66a640cf8fe695b24eaf0ecfef38e88feefc27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:39Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.589185 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.589226 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.589237 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.589255 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 
14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.589267 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:39Z","lastTransitionTime":"2025-12-09T14:58:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.592464 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23358419-437c-41bf-bf1f-0da572d7ae77\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5ec008c2fa2ff6b0bffe7a32708ff1d0bfcfc0b6edbdd084977b99a2051bdc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://376e853c04e3ccafef162872380c7bf1dc0301c04c70c5428b965ca18989088f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6099f0072464b66ae672e70c295fe21a13657b81cc71dcecc55314cd91816c75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-c
ontroller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b8f1c6dd89c91a8ff2ca8baf361ee78811eeb8e10f2c4bb0504644cdd1302d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:39Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.603723 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:39Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.691779 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.691821 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.691830 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.691844 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.691852 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:39Z","lastTransitionTime":"2025-12-09T14:58:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.794778 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.794812 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.794821 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.794835 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.794844 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:39Z","lastTransitionTime":"2025-12-09T14:58:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.896356 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.896403 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.896413 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.896428 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.896441 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:39Z","lastTransitionTime":"2025-12-09T14:58:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.998328 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.998366 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.998374 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.998405 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:39 crc kubenswrapper[4555]: I1209 14:58:39.998415 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:39Z","lastTransitionTime":"2025-12-09T14:58:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.027625 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.027662 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.027672 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.027689 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.027698 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:40Z","lastTransitionTime":"2025-12-09T14:58:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:40 crc kubenswrapper[4555]: E1209 14:58:40.036055 4555 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"73387e21-fe11-4735-b3c1-0a3cfb47f8fd\\\",\\\"systemUUID\\\":\\\"bd9f288e-1f46-4e96-a21f-7eb84ab89d9c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:40Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.038443 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.038475 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.038484 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.038499 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.038534 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:40Z","lastTransitionTime":"2025-12-09T14:58:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:40 crc kubenswrapper[4555]: E1209 14:58:40.046313 4555 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"73387e21-fe11-4735-b3c1-0a3cfb47f8fd\\\",\\\"systemUUID\\\":\\\"bd9f288e-1f46-4e96-a21f-7eb84ab89d9c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:40Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.048624 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.048661 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.048671 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.048684 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.048694 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:40Z","lastTransitionTime":"2025-12-09T14:58:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.055599 4555 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Dec 09 14:58:40 crc kubenswrapper[4555]: E1209 14:58:40.056177 4555 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"73387e21-fe11-4735-b3c1-0a3cfb47f8fd\\\",\\\"systemUUID\\\":\\\"bd9f288e-1f46-4e96-a21f-7eb84ab89d9c\\\"}}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": read tcp 192.168.26.32:59004->192.168.26.32:6443: use of closed network connection" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.073284 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.073337 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.073348 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.073365 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.073376 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:40Z","lastTransitionTime":"2025-12-09T14:58:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:40 crc kubenswrapper[4555]: E1209 14:58:40.081831 4555 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"73387e21-fe11-4735-b3c1-0a3cfb47f8fd\\\",\\\"systemUUID\\\":\\\"bd9f288e-1f46-4e96-a21f-7eb84ab89d9c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:40Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.085481 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.085512 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.085524 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.085538 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.085547 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:40Z","lastTransitionTime":"2025-12-09T14:58:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:40 crc kubenswrapper[4555]: E1209 14:58:40.094852 4555 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"73387e21-fe11-4735-b3c1-0a3cfb47f8fd\\\",\\\"systemUUID\\\":\\\"bd9f288e-1f46-4e96-a21f-7eb84ab89d9c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:40Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:40 crc kubenswrapper[4555]: E1209 14:58:40.095111 4555 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.100263 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.100293 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.100315 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.100328 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.100338 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:40Z","lastTransitionTime":"2025-12-09T14:58:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.171039 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tpsq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd8f66c1-da34-466f-ba53-4848c96b1287\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2028082f4c35a431ec183abac3ace39ae9d9d292a8b3b879cf35cf03dc606d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2028082f4c35a431ec183abac3ace39ae9d9d292a8b3b879cf35cf03dc606d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f6498db60b815cc152db11a9d809f415e6ed632f66cf7eaeecbacb230d59dbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f6498db60b815cc152db11a9d809f415e6ed632f66cf7eaeecbacb230d59dbe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd308a4726360ffaa40fd9895d4684062d998a9dec9c76222a4be5140ece0ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd308a4726360ffaa40fd9895d4684062d998a9dec9c76222a4be5140ece0ced\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e047250a4bced8e37ee26d9ba3065a61294fa9cc6d9491b6b96e35758b431edc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e047250a4bced8e37ee26d9ba3065a61294fa9cc6d9491b6b96e35758b431edc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d24862fb9296afeb3ec86c0cd5e862021c7e4c8a0f139f9fd11dfd1faba2b392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d24862fb9296afeb3ec86c0cd5e862021c7e4c8a0f139f9fd11dfd1faba2b392\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ac07e538318096b1c9c2c8d0608f0b4d9b4367df0bb24575591b2f3f6c28c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ac07e538318096b1c9c2c8d0608f0b4d9b4367df0bb24575591b2f3f6c28c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tpsq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:40Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.183165 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could 
not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:40Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.192870 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd5a66337668da0b71f8033ec52d3d1ed9ac69f468d6e48c1e863204ec6a6349\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2fd50693723f3d30a7f5fb890ed870f55b357017b85cdf7206ffc97d0d3d9b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:40Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.200896 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b929ab12-7d60-4e9f-a1a7-3e2d8b086d36\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04e20c1851a9313f071ddc6ae1191e467f3bea284ea6e0defd914dc5c926b7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff6nf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b220eef2e4180fd65d4a5381999b91753bb7322a9ab7101edf783cb800715fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff6nf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x6p6q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:40Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.201663 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.201743 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.201810 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.201866 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.201929 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:40Z","lastTransitionTime":"2025-12-09T14:58:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.209980 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t6kl9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c97d4dc5-ce47-4a7b-9511-7168db0ee4b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd97a458fe790bf9d7a8587a77d464c8bc4f97919bf4646660d92bd016be2707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47xhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t6kl9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:40Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.219454 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"031bef68-de1b-47ae-92d9-8f3ab57baecb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd53bcc6affff59de2add405be1496710b4488d3794d358533ddaeecb1c50c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cd3af11c101edd63da0124a2190c0ee60408cf12e241de0b5743aff5732738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://787954aa7f87026339e7b167ba10c60c81f91c4d0f234fc508684f929edda0ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c591bf964c364ed5dfb0b30ae43f7fd456c549f701b0f1b8f7ab9ca7b8648f1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7803e19de1c3c096dfb29cd40b66a640cf8fe695b24eaf0ecfef38e88feefc27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:40Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.228086 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23358419-437c-41bf-bf1f-0da572d7ae77\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5ec008c2fa2ff6b0bffe7a32708ff1d0bfcfc0b6edbdd084977b99a2051bdc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://376e853c04e3ccafef162872380c7bf1dc0301c04c70c5428b965ca18989088f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6099f0072464b66ae672e70c295fe21a13657b81cc71dcecc55314cd91816c75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b8f1c6dd89c91a8ff2ca8baf361ee78811eeb8e10f2c4bb0504644cdd1302d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:40Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.237163 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:40Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.244897 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://222e3c51a925bbfd932c3620f13ad919006e2dad2c9063c60559ece4dd239f3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:40Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.254098 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gjjcm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e34542c-29e8-43d3-add3-42aa080e0c8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b138ef0764baeeae025ff996cd643edabbc1de8f567e73f41fa748f481929656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqc9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gjjcm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:40Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.268011 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c87d128c-39da-4880-9efc-7d2e6319a854\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dee71204e7a71c311143a6ed6fbde69968bc877053d6a400d252eaa010a88ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6cd4a3aeb4f2f1ad5ad6b22f80134a393ccb4c5b7e22c33d97324c8f70800d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a05bba24ff7f1d956fa0d3cb5949de61a278cddb4674bdb723749a2b005d82d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec3d3a350dce6c9b5ea637d00abdaf43796a5595105dc82133f3d84349c12c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://932166905ddf0fc44b9b7356a8f5efe68e1d44307d8e486946c8285ae969b8e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cr
i-o://64685d2778ef0cf9d316897042a9e9353603a0cc7a234baf821ec03b149aeb07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2ab2314ac7279678aa798c5a984672570725524350b32ad586d4ca8dc55197\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"
mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9dc3533604ee31d652a75a8c65146c35b320c41f2d4fe85c3c03cadb99a9d46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ce6bcdccee7782ed6a2472052064ea5f0f9a5f25a8645c692165a2a4bf16dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9ce6bcdccee7782ed6a2472052064ea5f0f9a5f25a8645c692165a2a4bf16dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p4bl4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:40Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.277545 4555 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-72msc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4f13f82-a44f-461d-93b0-916a7ed0b925\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06c787dc46c305e7388f67f85601eb3e724c501d4da0504d4f8fd569f533c634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2spqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-72msc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:40Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.289986 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tpsq2" event={"ID":"dd8f66c1-da34-466f-ba53-4848c96b1287","Type":"ContainerStarted","Data":"1ae3cc97b46fa36f9bd6f439124d81700af58cb44671d46ffa2936260f6dda4e"} Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.290037 4555 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.290624 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.291801 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cb89a32-a132-4a5c-88ca-e561d2462abb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8452332d27457e371020c42533ef78285e33851568eed69be71bc9d10e663b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32422f3e8abddcc0d533cce0b4fc55c65309e00a02d4757d415d6f63f9b2e47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9db356805ba4080379f39ddf432276ac77417504341e77c1cc2d704756af1f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550d7f8a2e6cf94de70026192f2660d59f31fa1
675a7c7e432767db5bf58c1b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://630ad7ccda8e51ae4d116b5821e71ebcc3fb68b4614de58470610807ad351ed6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:40Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.301047 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d81323acabedcf739459ccc35af7dcbf68cd1159c9304c0ef841d774287425cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:40Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.303726 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.303753 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.303762 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.303775 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.303785 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:40Z","lastTransitionTime":"2025-12-09T14:58:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.306501 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.309710 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:40Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.316891 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t6kl9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c97d4dc5-ce47-4a7b-9511-7168db0ee4b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd97a458fe790bf9d7a8587a77d464c8bc4f97919bf4646660d92bd016be2707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47xhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t6kl9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-09T14:58:40Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.325885 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"031bef68-de1b-47ae-92d9-8f3ab57baecb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd53bcc6affff59de2add405be1496710b4488d3794d358533ddaeecb1c50c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cd3af11c101edd63da0124a2190c0ee60408cf12e241de0b5743aff5732738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://787954aa7f87026339e7b167ba10c60c81f91c4d0f234fc508684f929edda0ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c591bf964c364ed5dfb0b30ae43f7fd456c549f701b0f1b8f7ab9ca7b8648f1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7803e19de1c3c096dfb29cd40b66a640cf8fe695b24eaf0ecfef38e88feefc27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:40Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.337353 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23358419-437c-41bf-bf1f-0da572d7ae77\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5ec008c2fa2ff6b0bffe7a32708ff1d0bfcfc0b6edbdd084977b99a2051bdc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://376e853c04e3ccafef162872380c7bf1dc0301c04c70c5428b965ca18989088f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6099f0072464b66ae672e70c295fe21a13657b81cc71dcecc55314cd91816c75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b8f1c6dd89c91a8ff2ca8baf361ee78811eeb8e10f2c4bb0504644cdd1302d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:40Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.346738 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:40Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.358269 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-72msc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4f13f82-a44f-461d-93b0-916a7ed0b925\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06c787dc46c305e7388f67f85601eb3e724c501d4da0504d4f8fd569f533c634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2spqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-72msc\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:40Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.372095 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cb89a32-a132-4a5c-88ca-e561d2462abb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8452332d27457e371020c42533ef78285e33851568eed69be71bc9d10e663b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32422f3e8abddcc0d533cce0b4fc55c65309e00a02d4757d415d6f63f9b2e47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9db356805ba4080379f39ddf432276ac77417504341e77c1cc2d704756af1f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550d7f8a2e6cf94de70026192f2660d59f31fa1675a7c7e432767db5bf58c1b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://630ad7ccda8e51ae4d116b5821e71ebcc3fb68b4614de58470610807ad351ed6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:40Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.389210 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d81323acabedcf739459ccc35af7dcbf68cd1159c9304c0ef841d774287425cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:40Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.398788 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:40Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.406677 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.406709 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.406718 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.406737 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.406747 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:40Z","lastTransitionTime":"2025-12-09T14:58:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.408422 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://222e3c51a925bbfd932c3620f13ad919006e2dad2c9063c60559ece4dd239f3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:40Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.424678 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gjjcm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e34542c-29e8-43d3-add3-42aa080e0c8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b138ef0764baeeae025ff996cd643edabbc1de8f567e73f41fa748f481929656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqc9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gjjcm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:40Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.448859 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c87d128c-39da-4880-9efc-7d2e6319a854\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dee71204e7a71c311143a6ed6fbde69968bc877053d6a400d252eaa010a88ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6cd4a3aeb4f2f1ad5ad6b22f80134a393ccb4c5b7e22c33d97324c8f70800d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a05bba24ff7f1d956fa0d3cb5949de61a278cddb4674bdb723749a2b005d82d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec3d3a350dce6c9b5ea637d00abdaf43796a5595105dc82133f3d84349c12c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://932166905ddf0fc44b9b7356a8f5efe68e1d44307d8e486946c8285ae969b8e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64685
d2778ef0cf9d316897042a9e9353603a0cc7a234baf821ec03b149aeb07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2ab2314ac7279678aa798c5a984672570725524350b32ad586d4ca8dc55197\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\
\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9dc3533604ee31d652a75a8c65146c35b320c41f2d4fe85c3c03cadb99a9d46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ce6bcdccee7782ed6a2472052064ea5f0f9a5f25a8645c692165a2a4bf16dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9ce6bcdccee7782ed6a2472052064ea5f0f9a5f25a8645c692165a2a4bf16dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p4bl4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:40Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.464923 4555 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:40Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.485390 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd5a66337668da0b71f8033ec52d3d1ed9ac69f468d6e48c1e863204ec6a6349\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2fd50693723f3d30a7f5fb890ed870f55b357017b85cdf7206ffc97d0d3d9b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:40Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.503936 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b929ab12-7d60-4e9f-a1a7-3e2d8b086d36\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04e20c1851a9313f071ddc6ae1191e467f3bea284ea6e0defd914dc5c926b7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff6nf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b220eef2e4180fd65d4a5381999b91753bb7322a9ab7101edf783cb800715fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff6nf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x6p6q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:40Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.508570 4555 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.508643 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.508658 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.508687 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.508709 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:40Z","lastTransitionTime":"2025-12-09T14:58:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.518964 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tpsq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd8f66c1-da34-466f-ba53-4848c96b1287\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ae3cc97b46fa36f9bd6f439124d81700af58cb44671d46ffa2936260f6dda4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2028082f4c35a431ec183abac3ace39ae9d9d292a8b3b879cf35cf03dc606d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2
c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2028082f4c35a431ec183abac3ace39ae9d9d292a8b3b879cf35cf03dc606d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f6498db60b815cc152db11a9d809f415e6ed632f66cf7eaeecbacb230d59dbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f6498db60b815cc152db11a9d809f415e6ed632f66cf7eaeecbacb230d59dbe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd308a4726360ffaa40fd9895d4684062d998a9dec9c76222a4be5140ece0ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd308a4726360ffaa40fd9895d4684062d998a9dec9c76222a4be5140ece0ced\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/
secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e047250a4bced8e37ee26d9ba3065a61294fa9cc6d9491b6b96e35758b431edc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e047250a4bced8e37ee26d9ba3065a61294fa9cc6d9491b6b96e35758b431edc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d24862fb9296afeb3ec86c0cd5e862021c7e4c8a0f139f9fd11dfd1faba2b392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d24862fb9296afeb3ec86c0cd5e862021c7e4c8a0f139f9fd11dfd1faba2b392\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ac07e538318096b1c9c2c8d0608f0b4d9b4367df0bb24575591b2f3f6c28c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ac07e538318096b1c9c2c8d0608f0b4d9b4367df0bb24575591b2f3f6c28c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:38Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tpsq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:40Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.611816 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.611862 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.611874 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.611892 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.611904 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:40Z","lastTransitionTime":"2025-12-09T14:58:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.713921 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.713967 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.713977 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.713995 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.714005 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:40Z","lastTransitionTime":"2025-12-09T14:58:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.816519 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.816938 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.816951 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.816970 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.816986 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:40Z","lastTransitionTime":"2025-12-09T14:58:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.919559 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.919606 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.919616 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.919632 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:40 crc kubenswrapper[4555]: I1209 14:58:40.919645 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:40Z","lastTransitionTime":"2025-12-09T14:58:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:41 crc kubenswrapper[4555]: I1209 14:58:41.021683 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:41 crc kubenswrapper[4555]: I1209 14:58:41.021723 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:41 crc kubenswrapper[4555]: I1209 14:58:41.021733 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:41 crc kubenswrapper[4555]: I1209 14:58:41.021748 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:41 crc kubenswrapper[4555]: I1209 14:58:41.021757 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:41Z","lastTransitionTime":"2025-12-09T14:58:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:41 crc kubenswrapper[4555]: I1209 14:58:41.124034 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:41 crc kubenswrapper[4555]: I1209 14:58:41.124065 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:41 crc kubenswrapper[4555]: I1209 14:58:41.124075 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:41 crc kubenswrapper[4555]: I1209 14:58:41.124089 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:41 crc kubenswrapper[4555]: I1209 14:58:41.124101 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:41Z","lastTransitionTime":"2025-12-09T14:58:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:41 crc kubenswrapper[4555]: I1209 14:58:41.160657 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:58:41 crc kubenswrapper[4555]: I1209 14:58:41.160684 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:58:41 crc kubenswrapper[4555]: I1209 14:58:41.160667 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:58:41 crc kubenswrapper[4555]: E1209 14:58:41.160822 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 14:58:41 crc kubenswrapper[4555]: E1209 14:58:41.160907 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 14:58:41 crc kubenswrapper[4555]: E1209 14:58:41.161023 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 14:58:41 crc kubenswrapper[4555]: I1209 14:58:41.225916 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:41 crc kubenswrapper[4555]: I1209 14:58:41.225961 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:41 crc kubenswrapper[4555]: I1209 14:58:41.225972 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:41 crc kubenswrapper[4555]: I1209 14:58:41.225993 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:41 crc kubenswrapper[4555]: I1209 14:58:41.226007 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:41Z","lastTransitionTime":"2025-12-09T14:58:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:41 crc kubenswrapper[4555]: I1209 14:58:41.293398 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p4bl4_c87d128c-39da-4880-9efc-7d2e6319a854/ovnkube-controller/0.log" Dec 09 14:58:41 crc kubenswrapper[4555]: I1209 14:58:41.295391 4555 generic.go:334] "Generic (PLEG): container finished" podID="c87d128c-39da-4880-9efc-7d2e6319a854" containerID="0e2ab2314ac7279678aa798c5a984672570725524350b32ad586d4ca8dc55197" exitCode=1 Dec 09 14:58:41 crc kubenswrapper[4555]: I1209 14:58:41.295734 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" event={"ID":"c87d128c-39da-4880-9efc-7d2e6319a854","Type":"ContainerDied","Data":"0e2ab2314ac7279678aa798c5a984672570725524350b32ad586d4ca8dc55197"} Dec 09 14:58:41 crc kubenswrapper[4555]: I1209 14:58:41.296415 4555 scope.go:117] "RemoveContainer" containerID="0e2ab2314ac7279678aa798c5a984672570725524350b32ad586d4ca8dc55197" Dec 09 14:58:41 crc kubenswrapper[4555]: I1209 14:58:41.310793 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c87d128c-39da-4880-9efc-7d2e6319a854\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dee71204e7a71c311143a6ed6fbde69968bc877053d6a400d252eaa010a88ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6cd4a3aeb4f2f1ad5ad6b22f80134a393ccb4c5b7e22c33d97324c8f70800d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a05bba24ff7f1d956fa0d3cb5949de61a278cddb4674bdb723749a2b005d82d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec3d3a350dce6c9b5ea637d00abdaf43796a5595105dc82133f3d84349c12c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://932166905ddf0fc44b9b7356a8f5efe68e1d44307d8e486946c8285ae969b8e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64685d2778ef0cf9d316897042a9e9353603a0cc7a234baf821ec03b149aeb07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2ab2314ac7279678aa798c5a98467257072552
4350b32ad586d4ca8dc55197\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e2ab2314ac7279678aa798c5a984672570725524350b32ad586d4ca8dc55197\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"message\\\":\\\"14:58:40.957406 5819 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1209 14:58:40.957428 5819 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1209 14:58:40.957457 5819 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1209 14:58:40.957472 5819 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1209 14:58:40.959075 5819 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1209 14:58:40.959090 5819 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1209 14:58:40.959126 5819 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1209 14:58:40.959143 5819 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1209 14:58:40.959168 5819 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1209 14:58:40.959200 5819 handler.go:208] Removed *v1.Node event handler 2\\\\nI1209 14:58:40.959207 5819 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1209 14:58:40.959220 5819 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1209 14:58:40.959234 5819 handler.go:208] Removed *v1.Node event handler 7\\\\nI1209 14:58:40.959232 5819 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1209 14:58:40.959243 5819 factory.go:656] Stopping watch factory\\\\nI1209 14:58:40.959263 5819 ovnkube.go:599] Stopped ovnkube\\\\nI1209 
14:58:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9dc3533604ee31d652a75a8c65146c35b320c41f2d4fe85c3c03cadb99a9d46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ce6bcdccee7782ed6a2472052064ea5f0f9a5f25a8645c692165a2a4bf16dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d
1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9ce6bcdccee7782ed6a2472052064ea5f0f9a5f25a8645c692165a2a4bf16dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p4bl4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:41Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:41 crc kubenswrapper[4555]: I1209 14:58:41.317704 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-72msc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4f13f82-a44f-461d-93b0-916a7ed0b925\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06c787dc46c305e7388f67f85601eb3e724c501d4da0504d4f8fd569f533c634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2spqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.16
8.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-72msc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:41Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:41 crc kubenswrapper[4555]: I1209 14:58:41.327724 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:41 crc kubenswrapper[4555]: I1209 14:58:41.327755 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:41 crc kubenswrapper[4555]: I1209 14:58:41.327766 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:41 crc kubenswrapper[4555]: I1209 14:58:41.327782 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:41 crc kubenswrapper[4555]: I1209 14:58:41.327792 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:41Z","lastTransitionTime":"2025-12-09T14:58:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:41 crc kubenswrapper[4555]: I1209 14:58:41.333454 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cb89a32-a132-4a5c-88ca-e561d2462abb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8452332d27457e371020c42533ef78285e33851568eed69be71bc9d10e663b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32422f3e8abddcc0d533cce0b4fc55c65309e00a02d4757d415d6f63f9b2e47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9db356805ba4080379f39ddf432276ac77417504341e77c1cc2d704756af1f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550d7f8a2e6cf94de70026192f2660d59f31fa1
675a7c7e432767db5bf58c1b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://630ad7ccda8e51ae4d116b5821e71ebcc3fb68b4614de58470610807ad351ed6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:41Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:41 crc kubenswrapper[4555]: I1209 14:58:41.344771 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d81323acabedcf739459ccc35af7dcbf68cd1159c9304c0ef841d774287425cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:41Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:41 crc kubenswrapper[4555]: I1209 14:58:41.355320 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:41Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:41 crc kubenswrapper[4555]: I1209 14:58:41.366253 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://222e3c51a925bbfd932c3620f13ad919006e2dad2c9063c60559ece4dd239f3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:41Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:41 crc kubenswrapper[4555]: I1209 14:58:41.376405 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gjjcm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e34542c-29e8-43d3-add3-42aa080e0c8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b138ef0764baeeae025ff996cd643edabbc1de8f567e73f41fa748f481929656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqc9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gjjcm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:41Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:41 crc kubenswrapper[4555]: I1209 14:58:41.387701 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:41Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:41 crc kubenswrapper[4555]: I1209 14:58:41.396873 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd5a66337668da0b71f8033ec52d3d1ed9ac69f468d6e48c1e863204ec6a6349\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2fd50693723f3d30a7f5fb890ed870f55b357017b85cdf7206ffc97d0d3d9b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:41Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:41 crc kubenswrapper[4555]: I1209 14:58:41.405798 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b929ab12-7d60-4e9f-a1a7-3e2d8b086d36\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04e20c1851a9313f071ddc6ae1191e467f3bea284ea6e0defd914dc5c926b7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff6nf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b220eef2e4180fd65d4a5381999b91753bb7322a9ab7101edf783cb800715fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff6nf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x6p6q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:41Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:41 crc kubenswrapper[4555]: I1209 14:58:41.415481 4555 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tpsq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd8f66c1-da34-466f-ba53-4848c96b1287\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ae3cc97b46fa36f9bd6f439124d81700af58cb44671d46ffa2936260f6dda4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2028082f4c35a431ec183abac3ace39ae9d9d292a8b3b879cf35cf03dc606d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2028082f4c35a431ec183abac3ace39ae9d9d292a8b3b879cf35cf03dc606d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f6498db60b815cc152db11a9d809f415e6ed632f66cf7eaeecbacb230d59dbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2c
c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f6498db60b815cc152db11a9d809f415e6ed632f66cf7eaeecbacb230d59dbe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd308a4726360ffaa40fd9895d4684062d998a9dec9c76222a4be5140ece0ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd308a4726360ffaa40fd9895d4684062d998a9dec9c76222a4be5140ece0ced\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e047250a4bced8e37ee26d9ba3065a61294fa9cc6d9491b6b96e35758b431edc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e047250a4bced8e37ee26d9ba3065a61294fa9cc6d9491b6b96e35758b431edc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d24862fb9296afeb3ec86c0cd5e862021c7e4c8a0f139f9fd11dfd1faba2b392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d24862fb9296afeb3ec86c0cd5e862021c7e4c8a0f139f9fd11dfd1faba2b392\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ac07e538318096b1c9c2c8d0608f0b4d9b4367df0bb24575591b2f3f6c28c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ac07e538318096b1c9c2c8d0608f0b4d9b4367df0bb24575591b2f3f6c28c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tpsq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:41Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:41 crc kubenswrapper[4555]: I1209 14:58:41.422677 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t6kl9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c97d4dc5-ce47-4a7b-9511-7168db0ee4b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd97a458fe790bf9d7a8587a77d464c8bc4f97919bf4646660d92bd016be2707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47xhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t6kl9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:41Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:41 crc kubenswrapper[4555]: I1209 14:58:41.429774 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:41 crc kubenswrapper[4555]: I1209 14:58:41.429799 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:41 crc kubenswrapper[4555]: I1209 14:58:41.429808 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:41 crc kubenswrapper[4555]: I1209 14:58:41.429821 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:41 crc kubenswrapper[4555]: I1209 14:58:41.429830 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:41Z","lastTransitionTime":"2025-12-09T14:58:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:41 crc kubenswrapper[4555]: I1209 14:58:41.431850 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"031bef68-de1b-47ae-92d9-8f3ab57baecb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd53bcc6affff59de2add405be1496710b4488d3794d358533ddaeecb1c50c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cd3af11c101edd63da0124a2190c0ee60408cf12e241de0b5743aff5732738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://787954aa7f87026339e7b167ba10c60c81f91c4d0f234fc508684f929edda0ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name
\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c591bf964c364ed5dfb0b30ae43f7fd456c549f701b0f1b8f7ab9ca7b8648f1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7803e19de1c3c096dfb29cd40b66a640cf8fe695b24eaf0ecfef38e88feefc27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:41Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:41 crc kubenswrapper[4555]: I1209 14:58:41.442041 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23358419-437c-41bf-bf1f-0da572d7ae77\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5ec008c2fa2ff6b0bffe7a32708ff1d0bfcfc0b6edbdd084977b99a2051bdc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://376e853c04e3ccafef162872380c7bf1dc0301c04c70c5428b965ca18989088f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6099f0072464b66ae672e70c295fe21a13657b81cc71dcecc55314cd91816c75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b8f1c6dd89c91a8ff2ca8baf361ee78811eeb8e10f2c4bb0504644cdd1302d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:41Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:41 crc kubenswrapper[4555]: I1209 14:58:41.449730 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:41Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:41 crc kubenswrapper[4555]: I1209 14:58:41.532199 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:41 crc kubenswrapper[4555]: I1209 14:58:41.532236 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:41 crc kubenswrapper[4555]: I1209 14:58:41.532247 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:41 crc kubenswrapper[4555]: I1209 14:58:41.532263 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:41 crc kubenswrapper[4555]: I1209 14:58:41.532272 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:41Z","lastTransitionTime":"2025-12-09T14:58:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:41 crc kubenswrapper[4555]: I1209 14:58:41.634286 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:41 crc kubenswrapper[4555]: I1209 14:58:41.634344 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:41 crc kubenswrapper[4555]: I1209 14:58:41.634354 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:41 crc kubenswrapper[4555]: I1209 14:58:41.634380 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:41 crc kubenswrapper[4555]: I1209 14:58:41.634390 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:41Z","lastTransitionTime":"2025-12-09T14:58:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:41 crc kubenswrapper[4555]: I1209 14:58:41.736246 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:41 crc kubenswrapper[4555]: I1209 14:58:41.736286 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:41 crc kubenswrapper[4555]: I1209 14:58:41.736296 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:41 crc kubenswrapper[4555]: I1209 14:58:41.736325 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:41 crc kubenswrapper[4555]: I1209 14:58:41.736334 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:41Z","lastTransitionTime":"2025-12-09T14:58:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:41 crc kubenswrapper[4555]: I1209 14:58:41.838393 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:41 crc kubenswrapper[4555]: I1209 14:58:41.838432 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:41 crc kubenswrapper[4555]: I1209 14:58:41.838443 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:41 crc kubenswrapper[4555]: I1209 14:58:41.838457 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:41 crc kubenswrapper[4555]: I1209 14:58:41.838467 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:41Z","lastTransitionTime":"2025-12-09T14:58:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:41 crc kubenswrapper[4555]: I1209 14:58:41.940113 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:41 crc kubenswrapper[4555]: I1209 14:58:41.940148 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:41 crc kubenswrapper[4555]: I1209 14:58:41.940159 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:41 crc kubenswrapper[4555]: I1209 14:58:41.940173 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:41 crc kubenswrapper[4555]: I1209 14:58:41.940181 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:41Z","lastTransitionTime":"2025-12-09T14:58:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:42 crc kubenswrapper[4555]: I1209 14:58:42.042204 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:42 crc kubenswrapper[4555]: I1209 14:58:42.042236 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:42 crc kubenswrapper[4555]: I1209 14:58:42.042245 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:42 crc kubenswrapper[4555]: I1209 14:58:42.042258 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:42 crc kubenswrapper[4555]: I1209 14:58:42.042267 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:42Z","lastTransitionTime":"2025-12-09T14:58:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:42 crc kubenswrapper[4555]: I1209 14:58:42.144135 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:42 crc kubenswrapper[4555]: I1209 14:58:42.144168 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:42 crc kubenswrapper[4555]: I1209 14:58:42.144176 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:42 crc kubenswrapper[4555]: I1209 14:58:42.144189 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:42 crc kubenswrapper[4555]: I1209 14:58:42.144198 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:42Z","lastTransitionTime":"2025-12-09T14:58:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:42 crc kubenswrapper[4555]: I1209 14:58:42.246418 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:42 crc kubenswrapper[4555]: I1209 14:58:42.246454 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:42 crc kubenswrapper[4555]: I1209 14:58:42.246465 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:42 crc kubenswrapper[4555]: I1209 14:58:42.246479 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:42 crc kubenswrapper[4555]: I1209 14:58:42.246490 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:42Z","lastTransitionTime":"2025-12-09T14:58:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:42 crc kubenswrapper[4555]: I1209 14:58:42.299133 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p4bl4_c87d128c-39da-4880-9efc-7d2e6319a854/ovnkube-controller/1.log" Dec 09 14:58:42 crc kubenswrapper[4555]: I1209 14:58:42.299654 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p4bl4_c87d128c-39da-4880-9efc-7d2e6319a854/ovnkube-controller/0.log" Dec 09 14:58:42 crc kubenswrapper[4555]: I1209 14:58:42.301515 4555 generic.go:334] "Generic (PLEG): container finished" podID="c87d128c-39da-4880-9efc-7d2e6319a854" containerID="5901d1e7ebd3d979587ab0b065c05f639de1f07b2139edac2d4cf80c289503ff" exitCode=1 Dec 09 14:58:42 crc kubenswrapper[4555]: I1209 14:58:42.301546 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" event={"ID":"c87d128c-39da-4880-9efc-7d2e6319a854","Type":"ContainerDied","Data":"5901d1e7ebd3d979587ab0b065c05f639de1f07b2139edac2d4cf80c289503ff"} Dec 09 14:58:42 crc kubenswrapper[4555]: I1209 14:58:42.301581 4555 scope.go:117] "RemoveContainer" containerID="0e2ab2314ac7279678aa798c5a984672570725524350b32ad586d4ca8dc55197" Dec 09 14:58:42 crc kubenswrapper[4555]: I1209 14:58:42.302180 4555 scope.go:117] "RemoveContainer" containerID="5901d1e7ebd3d979587ab0b065c05f639de1f07b2139edac2d4cf80c289503ff" Dec 09 14:58:42 crc kubenswrapper[4555]: E1209 14:58:42.302404 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-p4bl4_openshift-ovn-kubernetes(c87d128c-39da-4880-9efc-7d2e6319a854)\"" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" podUID="c87d128c-39da-4880-9efc-7d2e6319a854" Dec 09 14:58:42 crc kubenswrapper[4555]: I1209 14:58:42.312542 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tpsq2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd8f66c1-da34-466f-ba53-4848c96b1287\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ae3cc97b46fa36f9bd6f439124d81700af58cb44671d46ffa2936260f6dda4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2028082f4c35a431ec183abac3ace39ae9d9d292a8b3b879cf35cf03dc606d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2028082f4c35a431ec183abac3ace39ae9d9d292a8b3b879cf35cf03dc606d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f6498db60b815cc152db11a9d809f415e6ed632f66cf7eaeecbacb230d59dbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f6498db60b815cc152db11a9d809f415e6ed632f66cf7eaeecbacb230d59dbe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd308a4726360ffaa40fd9895d4684062d998a9dec9c76222a4be5140ece0ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd308a4726360ffaa40fd9895d4684062d998a9dec9c76222a4be5140ece0ced\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e047250a4bced8e37ee26d9ba3065a61294fa9cc6d9491b6b96e35758b431edc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e047250a4bced8e37ee26d9ba3065a61294fa9cc6d9491b6b96e35758b431edc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d24862fb9296afeb3ec86c0cd5e862021c7e4c8a0f139f9fd11dfd1faba2b392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d24862fb9296afeb3ec86c0cd5e862021c7e4c8a0f139f9fd11dfd1faba2b392\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ac07e538318096b1c9c2c8d0608f0b4d9b4367df0bb24575591b2f3f6c28c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ac07e538318096b1c9c2c8d0608f0b4d9b4367df0bb24575591b2f3f6c28c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tpsq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:42Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:42 crc kubenswrapper[4555]: I1209 14:58:42.320561 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:42Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:42 crc kubenswrapper[4555]: I1209 14:58:42.329661 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd5a66337668da0b71f8033ec52d3d1ed9ac69f468d6e48c1e863204ec6a6349\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2fd50693723f3d30a7f5fb890ed870f55b357017b85cdf7206ffc97d0d3d9b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:42Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:42 crc kubenswrapper[4555]: I1209 14:58:42.337227 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b929ab12-7d60-4e9f-a1a7-3e2d8b086d36\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04e20c1851a9313f071ddc6ae1191e467f3bea284ea6e0defd914dc5c926b7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff6nf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b220eef2e4180fd65d4a5381999b91753bb7322a9ab7101edf783cb800715fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff6nf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x6p6q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:42Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:42 crc kubenswrapper[4555]: I1209 14:58:42.344608 4555 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-dns/node-resolver-t6kl9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c97d4dc5-ce47-4a7b-9511-7168db0ee4b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd97a458fe790bf9d7a8587a77d464c8bc4f97919bf4646660d92bd016be2707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47xhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t6kl9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:42Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:42 crc kubenswrapper[4555]: I1209 14:58:42.347939 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:42 crc kubenswrapper[4555]: I1209 14:58:42.347966 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:42 crc kubenswrapper[4555]: I1209 14:58:42.347974 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:42 crc kubenswrapper[4555]: I1209 14:58:42.347986 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:42 crc kubenswrapper[4555]: I1209 14:58:42.347994 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:42Z","lastTransitionTime":"2025-12-09T14:58:42Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:42 crc kubenswrapper[4555]: I1209 14:58:42.353661 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"031bef68-de1b-47ae-92d9-8f3ab57baecb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd53bcc6affff59de2add405be1496710b4488d3794d358533ddaeecb1c50c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cd3af11c101edd63da0124a2190c0ee60408cf12e241de0b5743aff5732738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://787954aa7f87026339e7b167ba10c60c81f91c4d0f234fc508684f929edda0ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\
\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c591bf964c364ed5dfb0b30ae43f7fd456c549f701b0f1b8f7ab9ca7b8648f1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7803e19de1c3c096dfb29cd40b66a640cf8fe695b24eaf0ecfef38e88feefc27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:42Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:42 crc kubenswrapper[4555]: I1209 14:58:42.362046 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23358419-437c-41bf-bf1f-0da572d7ae77\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5ec008c2fa2ff6b0bffe7a32708ff1d0bfcfc0b6edbdd084977b99a2051bdc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://376e853c04e3ccafef162872380c7bf1dc0301c04c70c5428b965ca18989088f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6099f0072464b66ae672e70c295fe21a13657b81cc71dcecc55314cd91816c75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b8f1c6dd89c91a8ff2ca8baf361ee78811eeb8e10f2c4bb0504644cdd1302d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:42Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:42 crc kubenswrapper[4555]: I1209 14:58:42.370643 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:42Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:42 crc kubenswrapper[4555]: I1209 14:58:42.378377 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://222e3c51a925bbfd932c3620f13ad919006e2dad2c9063c60559ece4dd239f3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:42Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:42 crc kubenswrapper[4555]: I1209 14:58:42.386438 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gjjcm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e34542c-29e8-43d3-add3-42aa080e0c8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b138ef0764baeeae025ff996cd643edabbc1de8f567e73f41fa748f481929656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqc9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gjjcm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:42Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:42 crc kubenswrapper[4555]: I1209 14:58:42.398213 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c87d128c-39da-4880-9efc-7d2e6319a854\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dee71204e7a71c311143a6ed6fbde69968bc877053d6a400d252eaa010a88ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6cd4a3aeb4f2f1ad5ad6b22f80134a393ccb4c5b7e22c33d97324c8f70800d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a05bba24ff7f1d956fa0d3cb5949de61a278cddb4674bdb723749a2b005d82d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec3d3a350dce6c9b5ea637d00abdaf43796a5595105dc82133f3d84349c12c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://932166905ddf0fc44b9b7356a8f5efe68e1d44307d8e486946c8285ae969b8e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64685
d2778ef0cf9d316897042a9e9353603a0cc7a234baf821ec03b149aeb07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5901d1e7ebd3d979587ab0b065c05f639de1f07b2139edac2d4cf80c289503ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e2ab2314ac7279678aa798c5a984672570725524350b32ad586d4ca8dc55197\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"message\\\":\\\"14:58:40.957406 5819 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1209 14:58:40.957428 5819 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1209 14:58:40.957457 5819 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1209 14:58:40.957472 5819 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1209 14:58:40.959075 5819 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1209 14:58:40.959090 5819 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1209 14:58:40.959126 5819 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1209 14:58:40.959143 5819 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1209 14:58:40.959168 5819 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1209 14:58:40.959200 5819 handler.go:208] Removed *v1.Node event handler 2\\\\nI1209 14:58:40.959207 5819 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1209 14:58:40.959220 5819 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1209 14:58:40.959234 5819 handler.go:208] Removed *v1.Node event handler 7\\\\nI1209 14:58:40.959232 5819 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1209 14:58:40.959243 5819 factory.go:656] Stopping watch factory\\\\nI1209 14:58:40.959263 5819 ovnkube.go:599] Stopped ovnkube\\\\nI1209 
14:58:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5901d1e7ebd3d979587ab0b065c05f639de1f07b2139edac2d4cf80c289503ff\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"message\\\":\\\"1.918808 5946 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-p4bl4 after 0 failed attempt(s)\\\\nI1209 14:58:41.919392 5946 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-p4bl4\\\\nI1209 14:58:41.918113 5946 services_controller.go:434] Service openshift-cluster-version/cluster-version-operator retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{cluster-version-operator openshift-cluster-version ddf4933a-f532-4906-9b8f-3b15aa433264 6187 0 2025-02-23 05:11:57 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:cluster-version-operator] map[exclude.release.openshift.io/internal-openshift-hosted:true include.release.openshift.io/self-managed-high-availability:true kubernetes.io/description:Expose cluster-version operator metrics to other in-cluster consumers. Access requires a prometheus-k8s RoleBinding in this namespace. service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:cluster-version-operator-serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc0006c6a07 \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Nam\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\
\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9dc3533604ee31d652a75a8c65146c35b320c41f2d4fe85c3c03cadb99a9d46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ce6bcdccee7782ed6a2472052064ea5f0f9a5f25a8645c692165a2a4bf16dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9ce6bcdccee7782ed6a2472052064ea5f0f9a5f25a8645c692165a2a4bf16dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p4bl4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:42Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:42 crc kubenswrapper[4555]: I1209 14:58:42.405070 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-72msc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4f13f82-a44f-461d-93b0-916a7ed0b925\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06c787dc46c305e7388f67f85601eb3e724c501d4da0504d4f8fd569f533c634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2spqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-72msc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:42Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:42 crc kubenswrapper[4555]: I1209 14:58:42.421472 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cb89a32-a132-4a5c-88ca-e561d2462abb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8452332d27457e371020c42533ef78285e33851568eed69be71bc9d10e663b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32422f3e8abddcc0d533cce0b4fc55c65309e00a02d4757d415d6f63f9b2e47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9db356805ba4080379f39ddf432276ac77417504341e77c1cc2d704756af1f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550d7f8a2e6cf94de70026192f2660d59f31fa1
675a7c7e432767db5bf58c1b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://630ad7ccda8e51ae4d116b5821e71ebcc3fb68b4614de58470610807ad351ed6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:42Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:42 crc kubenswrapper[4555]: I1209 14:58:42.429858 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d81323acabedcf739459ccc35af7dcbf68cd1159c9304c0ef841d774287425cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:42Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:42 crc kubenswrapper[4555]: I1209 14:58:42.437359 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:42Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:42 crc kubenswrapper[4555]: I1209 14:58:42.449662 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:42 crc kubenswrapper[4555]: I1209 14:58:42.449715 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:42 crc kubenswrapper[4555]: I1209 14:58:42.449726 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:42 crc kubenswrapper[4555]: I1209 14:58:42.449739 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:42 crc kubenswrapper[4555]: I1209 14:58:42.449748 4555 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:42Z","lastTransitionTime":"2025-12-09T14:58:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:42 crc kubenswrapper[4555]: I1209 14:58:42.551511 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:42 crc kubenswrapper[4555]: I1209 14:58:42.551539 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:42 crc kubenswrapper[4555]: I1209 14:58:42.551549 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:42 crc kubenswrapper[4555]: I1209 14:58:42.551575 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:42 crc kubenswrapper[4555]: I1209 14:58:42.551585 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:42Z","lastTransitionTime":"2025-12-09T14:58:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:42 crc kubenswrapper[4555]: I1209 14:58:42.653033 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:42 crc kubenswrapper[4555]: I1209 14:58:42.653059 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:42 crc kubenswrapper[4555]: I1209 14:58:42.653068 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:42 crc kubenswrapper[4555]: I1209 14:58:42.653077 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:42 crc kubenswrapper[4555]: I1209 14:58:42.653085 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:42Z","lastTransitionTime":"2025-12-09T14:58:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:42 crc kubenswrapper[4555]: I1209 14:58:42.754580 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:42 crc kubenswrapper[4555]: I1209 14:58:42.754608 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:42 crc kubenswrapper[4555]: I1209 14:58:42.754616 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:42 crc kubenswrapper[4555]: I1209 14:58:42.754624 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:42 crc kubenswrapper[4555]: I1209 14:58:42.754633 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:42Z","lastTransitionTime":"2025-12-09T14:58:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:42 crc kubenswrapper[4555]: I1209 14:58:42.856169 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:42 crc kubenswrapper[4555]: I1209 14:58:42.856196 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:42 crc kubenswrapper[4555]: I1209 14:58:42.856205 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:42 crc kubenswrapper[4555]: I1209 14:58:42.856214 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:42 crc kubenswrapper[4555]: I1209 14:58:42.856221 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:42Z","lastTransitionTime":"2025-12-09T14:58:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:42 crc kubenswrapper[4555]: I1209 14:58:42.958394 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:42 crc kubenswrapper[4555]: I1209 14:58:42.958418 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:42 crc kubenswrapper[4555]: I1209 14:58:42.958427 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:42 crc kubenswrapper[4555]: I1209 14:58:42.958436 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:42 crc kubenswrapper[4555]: I1209 14:58:42.958444 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:42Z","lastTransitionTime":"2025-12-09T14:58:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:42 crc kubenswrapper[4555]: I1209 14:58:42.976819 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 14:58:42 crc kubenswrapper[4555]: I1209 14:58:42.976883 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:58:42 crc kubenswrapper[4555]: E1209 14:58:42.976921 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 14:58:58.976898598 +0000 UTC m=+49.113611316 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 14:58:42 crc kubenswrapper[4555]: E1209 14:58:42.976941 4555 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 09 14:58:42 crc kubenswrapper[4555]: E1209 14:58:42.976974 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-09 14:58:58.97696387 +0000 UTC m=+49.113676588 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 09 14:58:42 crc kubenswrapper[4555]: I1209 14:58:42.976974 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:58:42 crc kubenswrapper[4555]: E1209 14:58:42.977087 4555 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 09 14:58:42 crc kubenswrapper[4555]: E1209 14:58:42.977123 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-09 14:58:58.977115224 +0000 UTC m=+49.113827942 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 09 14:58:43 crc kubenswrapper[4555]: I1209 14:58:43.060251 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:43 crc kubenswrapper[4555]: I1209 14:58:43.060276 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:43 crc kubenswrapper[4555]: I1209 14:58:43.060284 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:43 crc kubenswrapper[4555]: I1209 14:58:43.060293 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:43 crc kubenswrapper[4555]: I1209 14:58:43.060315 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:43Z","lastTransitionTime":"2025-12-09T14:58:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:43 crc kubenswrapper[4555]: I1209 14:58:43.078013 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:58:43 crc kubenswrapper[4555]: I1209 14:58:43.078043 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:58:43 crc kubenswrapper[4555]: E1209 14:58:43.078126 4555 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 09 14:58:43 crc kubenswrapper[4555]: E1209 14:58:43.078142 4555 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 09 14:58:43 crc kubenswrapper[4555]: E1209 14:58:43.078145 4555 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 09 14:58:43 crc kubenswrapper[4555]: E1209 14:58:43.078183 4555 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 09 14:58:43 crc kubenswrapper[4555]: E1209 14:58:43.078197 4555 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 14:58:43 crc kubenswrapper[4555]: E1209 14:58:43.078239 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-09 14:58:59.078226113 +0000 UTC m=+49.214938831 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 14:58:43 crc kubenswrapper[4555]: E1209 14:58:43.078152 4555 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 14:58:43 crc kubenswrapper[4555]: E1209 14:58:43.078292 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-09 14:58:59.07828328 +0000 UTC m=+49.214995998 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 14:58:43 crc kubenswrapper[4555]: I1209 14:58:43.162472 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:58:43 crc kubenswrapper[4555]: I1209 14:58:43.162493 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:58:43 crc kubenswrapper[4555]: E1209 14:58:43.162566 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 14:58:43 crc kubenswrapper[4555]: I1209 14:58:43.162470 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:58:43 crc kubenswrapper[4555]: E1209 14:58:43.162667 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 14:58:43 crc kubenswrapper[4555]: E1209 14:58:43.162785 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 14:58:43 crc kubenswrapper[4555]: I1209 14:58:43.163384 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:43 crc kubenswrapper[4555]: I1209 14:58:43.163412 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:43 crc kubenswrapper[4555]: I1209 14:58:43.163421 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:43 crc kubenswrapper[4555]: I1209 14:58:43.163431 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:43 crc kubenswrapper[4555]: I1209 14:58:43.163438 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:43Z","lastTransitionTime":"2025-12-09T14:58:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:43 crc kubenswrapper[4555]: I1209 14:58:43.264924 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:43 crc kubenswrapper[4555]: I1209 14:58:43.264951 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:43 crc kubenswrapper[4555]: I1209 14:58:43.264960 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:43 crc kubenswrapper[4555]: I1209 14:58:43.264968 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:43 crc kubenswrapper[4555]: I1209 14:58:43.264975 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:43Z","lastTransitionTime":"2025-12-09T14:58:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:43 crc kubenswrapper[4555]: I1209 14:58:43.305210 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p4bl4_c87d128c-39da-4880-9efc-7d2e6319a854/ovnkube-controller/1.log" Dec 09 14:58:43 crc kubenswrapper[4555]: I1209 14:58:43.307432 4555 scope.go:117] "RemoveContainer" containerID="5901d1e7ebd3d979587ab0b065c05f639de1f07b2139edac2d4cf80c289503ff" Dec 09 14:58:43 crc kubenswrapper[4555]: E1209 14:58:43.307550 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-p4bl4_openshift-ovn-kubernetes(c87d128c-39da-4880-9efc-7d2e6319a854)\"" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" podUID="c87d128c-39da-4880-9efc-7d2e6319a854" Dec 09 14:58:43 crc kubenswrapper[4555]: I1209 14:58:43.320820 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cb89a32-a132-4a5c-88ca-e561d2462abb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8452332d27457e371020c42533ef78285e33851568eed69be71bc9d10e663b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32422f3e8abddcc0d533cce0b4fc55c65309e00a02d4757d415d6f63f9b2e47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/e
tc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9db356805ba4080379f39ddf432276ac77417504341e77c1cc2d704756af1f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550d7f8a2e6cf94de70026192f2660d59f31fa1675a7c7e432767db5bf58c1b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://630ad7ccda8e51ae4d116b5821e71ebcc3fb68b4614de58470610807ad351ed6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10
Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:43Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:43 crc kubenswrapper[4555]: I1209 14:58:43.328795 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d81323acabedcf739459ccc35af7dcbf68cd1159c9304c0ef841d774287425cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:43Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:43 crc kubenswrapper[4555]: I1209 14:58:43.336158 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:43Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:43 crc kubenswrapper[4555]: I1209 14:58:43.343807 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://222e3c51a925bbfd932c3620f13ad919006e2dad2c9063c60559ece4dd239f3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:43Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:43 crc kubenswrapper[4555]: I1209 14:58:43.351629 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gjjcm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e34542c-29e8-43d3-add3-42aa080e0c8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b138ef0764baeeae025ff996cd643edabbc1de8f567e73f41fa748f481929656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqc9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gjjcm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:43Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:43 crc kubenswrapper[4555]: I1209 14:58:43.363476 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c87d128c-39da-4880-9efc-7d2e6319a854\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dee71204e7a71c311143a6ed6fbde69968bc877053d6a400d252eaa010a88ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6cd4a3aeb4f2f1ad5ad6b22f80134a393ccb4c5b7e22c33d97324c8f70800d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a05bba24ff7f1d956fa0d3cb5949de61a278cddb4674bdb723749a2b005d82d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec3d3a350dce6c9b5ea637d00abdaf43796a5595105dc82133f3d84349c12c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://932166905ddf0fc44b9b7356a8f5efe68e1d44307d8e486946c8285ae969b8e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64685
d2778ef0cf9d316897042a9e9353603a0cc7a234baf821ec03b149aeb07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5901d1e7ebd3d979587ab0b065c05f639de1f07b2139edac2d4cf80c289503ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5901d1e7ebd3d979587ab0b065c05f639de1f07b2139edac2d4cf80c289503ff\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"message\\\":\\\"1.918808 5946 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-p4bl4 after 0 failed attempt(s)\\\\nI1209 14:58:41.919392 5946 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-p4bl4\\\\nI1209 14:58:41.918113 5946 services_controller.go:434] Service openshift-cluster-version/cluster-version-operator retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{cluster-version-operator openshift-cluster-version ddf4933a-f532-4906-9b8f-3b15aa433264 6187 0 2025-02-23 05:11:57 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:cluster-version-operator] map[exclude.release.openshift.io/internal-openshift-hosted:true include.release.openshift.io/self-managed-high-availability:true kubernetes.io/description:Expose cluster-version operator metrics to other in-cluster consumers. Access requires a prometheus-k8s RoleBinding in this namespace. 
service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:cluster-version-operator-serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc0006c6a07 \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Nam\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-p4bl4_openshift-ovn-kubernetes(c87d128c-39da-4880-9efc-7d2e6319a854)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9dc3533604ee31d652a75a8c65146c35b320c41f2d4fe85c3c03cadb99a9d46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"o
vnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ce6bcdccee7782ed6a2472052064ea5f0f9a5f25a8645c692165a2a4bf16dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9ce6bcdccee7782ed6a2472052064ea5f0f9a5f25a8645c692165a2a4bf16dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p4bl4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:43Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:43 crc kubenswrapper[4555]: I1209 14:58:43.365996 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:43 crc kubenswrapper[4555]: I1209 14:58:43.366020 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:43 crc kubenswrapper[4555]: I1209 14:58:43.366029 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:43 crc kubenswrapper[4555]: I1209 14:58:43.366039 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:43 crc kubenswrapper[4555]: I1209 14:58:43.366046 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:43Z","lastTransitionTime":"2025-12-09T14:58:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:43 crc kubenswrapper[4555]: I1209 14:58:43.370595 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-72msc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4f13f82-a44f-461d-93b0-916a7ed0b925\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06c787dc46c305e7388f67f85601eb3e724c501d4da0504d4f8fd569f533c634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2spqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-72msc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:43Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:43 crc kubenswrapper[4555]: I1209 14:58:43.378144 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:43Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:43 crc kubenswrapper[4555]: I1209 14:58:43.386048 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd5a66337668da0b71f8033ec52d3d1ed9ac69f468d6e48c1e863204ec6a6349\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2fd50693723f3d30a7f5fb890ed870f55b357017b85cdf7206ffc97d0d3d9b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:43Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:43 crc kubenswrapper[4555]: I1209 14:58:43.393029 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b929ab12-7d60-4e9f-a1a7-3e2d8b086d36\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04e20c1851a9313f071ddc6ae1191e467f3bea284ea6e0defd914dc5c926b7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff6nf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b220eef2e4180fd65d4a5381999b91753bb7322a9ab7101edf783cb800715fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff6nf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x6p6q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:43Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:43 crc kubenswrapper[4555]: I1209 14:58:43.401684 4555 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tpsq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd8f66c1-da34-466f-ba53-4848c96b1287\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ae3cc97b46fa36f9bd6f439124d81700af58cb44671d46ffa2936260f6dda4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2028082f4c35a431ec183abac3ace39ae9d9d292a8b3b879cf35cf03dc606d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2028082f4c35a431ec183abac3ace39ae9d9d292a8b3b879cf35cf03dc606d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f6498db60b815cc152db11a9d809f415e6ed632f66cf7eaeecbacb230d59dbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2c
c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f6498db60b815cc152db11a9d809f415e6ed632f66cf7eaeecbacb230d59dbe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd308a4726360ffaa40fd9895d4684062d998a9dec9c76222a4be5140ece0ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd308a4726360ffaa40fd9895d4684062d998a9dec9c76222a4be5140ece0ced\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e047250a4bced8e37ee26d9ba3065a61294fa9cc6d9491b6b96e35758b431edc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e047250a4bced8e37ee26d9ba3065a61294fa9cc6d9491b6b96e35758b431edc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d24862fb9296afeb3ec86c0cd5e862021c7e4c8a0f139f9fd11dfd1faba2b392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d24862fb9296afeb3ec86c0cd5e862021c7e4c8a0f139f9fd11dfd1faba2b392\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ac07e538318096b1c9c2c8d0608f0b4d9b4367df0bb24575591b2f3f6c28c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ac07e538318096b1c9c2c8d0608f0b4d9b4367df0bb24575591b2f3f6c28c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tpsq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:43Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:43 crc kubenswrapper[4555]: I1209 14:58:43.408473 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t6kl9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c97d4dc5-ce47-4a7b-9511-7168db0ee4b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd97a458fe790bf9d7a8587a77d464c8bc4f97919bf4646660d92bd016be2707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47xhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t6kl9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:43Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:43 crc kubenswrapper[4555]: I1209 14:58:43.416726 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"031bef68-de1b-47ae-92d9-8f3ab57baecb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd53bcc6affff59de2add405be1496710b4488d3794d358533ddaeecb1c50c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cd3af11c101edd63da0124a2190c0ee60408cf12e241de0b5743aff5732738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://787954aa7f87026339e7b167ba10c60c81f91c4d0f234fc508684f929edda0ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c591bf964c364ed5dfb0b30ae43f7fd456c549f701b0f1b8f7ab9ca7b8648f1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7803e19de1c3c096dfb29cd40b66a640cf8fe695b24eaf0ecfef38e88feefc27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:43Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:43 crc kubenswrapper[4555]: I1209 14:58:43.424544 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23358419-437c-41bf-bf1f-0da572d7ae77\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5ec008c2fa2ff6b0bffe7a32708ff1d0bfcfc0b6edbdd084977b99a2051bdc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://376e853c04e3ccafef162872380c7bf1dc0301c04c70c5428b965ca18989088f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6099f0072464b66ae672e70c295fe21a13657b81cc71dcecc55314cd91816c75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b8f1c6dd89c91a8ff2ca8baf361ee78811eeb8e10f2c4bb0504644cdd1302d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:43Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:43 crc kubenswrapper[4555]: I1209 14:58:43.431970 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:43Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:43 crc kubenswrapper[4555]: I1209 14:58:43.467073 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:43 crc kubenswrapper[4555]: I1209 14:58:43.467092 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:43 crc kubenswrapper[4555]: I1209 14:58:43.467100 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:43 crc kubenswrapper[4555]: I1209 14:58:43.467110 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:43 crc kubenswrapper[4555]: I1209 14:58:43.467117 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:43Z","lastTransitionTime":"2025-12-09T14:58:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:43 crc kubenswrapper[4555]: I1209 14:58:43.568548 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:43 crc kubenswrapper[4555]: I1209 14:58:43.568573 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:43 crc kubenswrapper[4555]: I1209 14:58:43.568583 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:43 crc kubenswrapper[4555]: I1209 14:58:43.568593 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:43 crc kubenswrapper[4555]: I1209 14:58:43.568601 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:43Z","lastTransitionTime":"2025-12-09T14:58:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:43 crc kubenswrapper[4555]: I1209 14:58:43.669918 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:43 crc kubenswrapper[4555]: I1209 14:58:43.670193 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:43 crc kubenswrapper[4555]: I1209 14:58:43.670276 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:43 crc kubenswrapper[4555]: I1209 14:58:43.670387 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:43 crc kubenswrapper[4555]: I1209 14:58:43.670446 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:43Z","lastTransitionTime":"2025-12-09T14:58:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:43 crc kubenswrapper[4555]: I1209 14:58:43.772514 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:43 crc kubenswrapper[4555]: I1209 14:58:43.772545 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:43 crc kubenswrapper[4555]: I1209 14:58:43.772553 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:43 crc kubenswrapper[4555]: I1209 14:58:43.772568 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:43 crc kubenswrapper[4555]: I1209 14:58:43.772577 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:43Z","lastTransitionTime":"2025-12-09T14:58:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:43 crc kubenswrapper[4555]: I1209 14:58:43.874067 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:43 crc kubenswrapper[4555]: I1209 14:58:43.874189 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:43 crc kubenswrapper[4555]: I1209 14:58:43.874285 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:43 crc kubenswrapper[4555]: I1209 14:58:43.874440 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:43 crc kubenswrapper[4555]: I1209 14:58:43.874499 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:43Z","lastTransitionTime":"2025-12-09T14:58:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:43 crc kubenswrapper[4555]: I1209 14:58:43.975966 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:43 crc kubenswrapper[4555]: I1209 14:58:43.976002 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:43 crc kubenswrapper[4555]: I1209 14:58:43.976011 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:43 crc kubenswrapper[4555]: I1209 14:58:43.976023 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:43 crc kubenswrapper[4555]: I1209 14:58:43.976031 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:43Z","lastTransitionTime":"2025-12-09T14:58:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:44 crc kubenswrapper[4555]: I1209 14:58:44.078268 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:44 crc kubenswrapper[4555]: I1209 14:58:44.078324 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:44 crc kubenswrapper[4555]: I1209 14:58:44.078346 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:44 crc kubenswrapper[4555]: I1209 14:58:44.078357 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:44 crc kubenswrapper[4555]: I1209 14:58:44.078365 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:44Z","lastTransitionTime":"2025-12-09T14:58:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:44 crc kubenswrapper[4555]: I1209 14:58:44.180484 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:44 crc kubenswrapper[4555]: I1209 14:58:44.180515 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:44 crc kubenswrapper[4555]: I1209 14:58:44.180524 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:44 crc kubenswrapper[4555]: I1209 14:58:44.180534 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:44 crc kubenswrapper[4555]: I1209 14:58:44.180544 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:44Z","lastTransitionTime":"2025-12-09T14:58:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:44 crc kubenswrapper[4555]: I1209 14:58:44.282373 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:44 crc kubenswrapper[4555]: I1209 14:58:44.282406 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:44 crc kubenswrapper[4555]: I1209 14:58:44.282417 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:44 crc kubenswrapper[4555]: I1209 14:58:44.282428 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:44 crc kubenswrapper[4555]: I1209 14:58:44.282437 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:44Z","lastTransitionTime":"2025-12-09T14:58:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:44 crc kubenswrapper[4555]: I1209 14:58:44.383846 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:44 crc kubenswrapper[4555]: I1209 14:58:44.383955 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:44 crc kubenswrapper[4555]: I1209 14:58:44.384020 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:44 crc kubenswrapper[4555]: I1209 14:58:44.384082 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:44 crc kubenswrapper[4555]: I1209 14:58:44.384142 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:44Z","lastTransitionTime":"2025-12-09T14:58:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:44 crc kubenswrapper[4555]: I1209 14:58:44.485877 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:44 crc kubenswrapper[4555]: I1209 14:58:44.485912 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:44 crc kubenswrapper[4555]: I1209 14:58:44.485924 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:44 crc kubenswrapper[4555]: I1209 14:58:44.485936 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:44 crc kubenswrapper[4555]: I1209 14:58:44.485944 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:44Z","lastTransitionTime":"2025-12-09T14:58:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:44 crc kubenswrapper[4555]: I1209 14:58:44.588158 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:44 crc kubenswrapper[4555]: I1209 14:58:44.588255 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:44 crc kubenswrapper[4555]: I1209 14:58:44.588365 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:44 crc kubenswrapper[4555]: I1209 14:58:44.588442 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:44 crc kubenswrapper[4555]: I1209 14:58:44.588498 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:44Z","lastTransitionTime":"2025-12-09T14:58:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:44 crc kubenswrapper[4555]: I1209 14:58:44.690019 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:44 crc kubenswrapper[4555]: I1209 14:58:44.690264 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:44 crc kubenswrapper[4555]: I1209 14:58:44.690363 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:44 crc kubenswrapper[4555]: I1209 14:58:44.690438 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:44 crc kubenswrapper[4555]: I1209 14:58:44.690505 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:44Z","lastTransitionTime":"2025-12-09T14:58:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:44 crc kubenswrapper[4555]: I1209 14:58:44.791903 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:44 crc kubenswrapper[4555]: I1209 14:58:44.792008 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:44 crc kubenswrapper[4555]: I1209 14:58:44.792081 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:44 crc kubenswrapper[4555]: I1209 14:58:44.792139 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:44 crc kubenswrapper[4555]: I1209 14:58:44.792196 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:44Z","lastTransitionTime":"2025-12-09T14:58:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:44 crc kubenswrapper[4555]: I1209 14:58:44.893477 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:44 crc kubenswrapper[4555]: I1209 14:58:44.893503 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:44 crc kubenswrapper[4555]: I1209 14:58:44.893511 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:44 crc kubenswrapper[4555]: I1209 14:58:44.893522 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:44 crc kubenswrapper[4555]: I1209 14:58:44.893530 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:44Z","lastTransitionTime":"2025-12-09T14:58:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:44 crc kubenswrapper[4555]: I1209 14:58:44.994501 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:44 crc kubenswrapper[4555]: I1209 14:58:44.994525 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:44 crc kubenswrapper[4555]: I1209 14:58:44.994534 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:44 crc kubenswrapper[4555]: I1209 14:58:44.994543 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:44 crc kubenswrapper[4555]: I1209 14:58:44.994550 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:44Z","lastTransitionTime":"2025-12-09T14:58:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.000684 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qchk6"] Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.001027 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qchk6" Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.002407 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.002568 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.010556 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:45Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.019004 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd5a66337668da0b71f8033ec52d3d1ed9ac69f468d6e48c1e863204ec6a6349\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2fd50693723f3d30a7f5fb890ed870f55b357017b85cdf7206ffc97d0d3d9b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:45Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.026638 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b929ab12-7d60-4e9f-a1a7-3e2d8b086d36\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04e20c1851a9313f071ddc6ae1191e467f3bea284ea6e0defd914dc5c926b7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff6nf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b220eef2e4180fd65d4a5381999b91753bb7322a9ab7101edf783cb800715fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernete
s.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff6nf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x6p6q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:45Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.036181 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tpsq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd8f66c1-da34-466f-ba53-4848c96b1287\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ae3cc97b46fa36f9bd6f439124d81700af58cb44671d46ffa2936260f6dda4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2028082f4c35a431ec183abac3ace39ae9d9d292a8b3b879cf35cf03dc606d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2028082f4c35a431ec183abac3ace39ae9d9d292a8b3b879cf35cf03dc606d
21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f6498db60b815cc152db11a9d809f415e6ed632f66cf7eaeecbacb230d59dbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f6498db60b815cc152db11a9d809f415e6ed632f66cf7eaeecbacb230d59dbe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd308a4726360ffaa40fd9895d4684062d998a9dec9c76222a4be5140ece0ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd308a4726360ffaa40fd9895d4684062d998a9dec9c76222a4be5140ece0ced\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e047250a4bced8e37ee26d9ba3065a61294fa9cc6d9491b6b96e35758b431edc\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e047250a4bced8e37ee26d9ba3065a61294fa9cc6d9491b6b96e35758b431edc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d24862fb9296afeb3ec86c0cd5e862021c7e4c8a0f139f9fd11dfd1faba2b392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d24862fb9296afeb3ec86c0cd5e862021c7e4c8a0f139f9fd11dfd1faba2b392\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ac07e538318096b1c9c2c8d0608f0b4d9b4367df0bb24575591b2f3f6c28c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ac07e538318096b1c9c2c8d0608f0b4d9b4367df0bb24575591b2f3f6c28c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tpsq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:45Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.043476 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qchk6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae951ddc-0e90-497c-af85-8e39088f32d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwhfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwhfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qchk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:45Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.050146 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t6kl9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c97d4dc5-ce47-4a7b-9511-7168db0ee4b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd97a458fe790bf9d7a8587a77d464c8bc4f97919bf4646660d92bd016be2707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47xhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t6kl9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:45Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.061647 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"031bef68-de1b-47ae-92d9-8f3ab57baecb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd53bcc6affff59de2add405be1496710b4488d3794d358533ddaeecb1c50c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cd3af11c101edd63da0124a2190c0ee60408cf12e241de0b5743aff5732738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://787954aa7f87026339e7b167ba10c60c81f91c4d0f234fc508684f929edda0ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c591bf964c364ed5dfb0b30ae43f7fd456c549f701b0f1b8f7ab9ca7b8648f1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7803e19de1c3c096dfb29cd40b66a640cf8fe695b24eaf0ecfef38e88feefc27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:45Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.073184 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23358419-437c-41bf-bf1f-0da572d7ae77\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5ec008c2fa2ff6b0bffe7a32708ff1d0bfcfc0b6edbdd084977b99a2051bdc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://376e853c04e3ccafef162872380c7bf1dc0301c04c70c5428b965ca18989088f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6099f0072464b66ae672e70c295fe21a13657b81cc71dcecc55314cd91816c75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b8f1c6dd89c91a8ff2ca8baf361ee78811eeb8e10f2c4bb0504644cdd1302d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:45Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.080617 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:45Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.086825 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-72msc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4f13f82-a44f-461d-93b0-916a7ed0b925\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06c787dc46c305e7388f67f85601eb3e724c501d4da0504d4f8fd569f533c634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2spqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-72msc\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:45Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.093633 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwhfz\" (UniqueName: \"kubernetes.io/projected/ae951ddc-0e90-497c-af85-8e39088f32d9-kube-api-access-zwhfz\") pod \"ovnkube-control-plane-749d76644c-qchk6\" (UID: \"ae951ddc-0e90-497c-af85-8e39088f32d9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qchk6" Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.093668 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ae951ddc-0e90-497c-af85-8e39088f32d9-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-qchk6\" (UID: \"ae951ddc-0e90-497c-af85-8e39088f32d9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qchk6" Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.093685 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ae951ddc-0e90-497c-af85-8e39088f32d9-env-overrides\") pod \"ovnkube-control-plane-749d76644c-qchk6\" (UID: \"ae951ddc-0e90-497c-af85-8e39088f32d9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qchk6" Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.093746 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ae951ddc-0e90-497c-af85-8e39088f32d9-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-qchk6\" (UID: \"ae951ddc-0e90-497c-af85-8e39088f32d9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qchk6" Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.096744 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.096767 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.096775 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.096786 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.096794 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:45Z","lastTransitionTime":"2025-12-09T14:58:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.100215 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cb89a32-a132-4a5c-88ca-e561d2462abb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8452332d27457e371020c42533ef78285e33851568eed69be71bc9d10e663b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32422f3e8abddcc0d533cce0b4fc55c65309e00a02d4757d415d6f63f9b2e47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9db356805ba4080379f39ddf432276ac77417504341e77c1cc2d704756af1f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550d7f8a2e6cf94de70026192f2660d59f31fa1675a7c7e432767db5bf58c1b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://630ad7ccda8e51ae4d116b5821e71ebcc3fb68b4614de58470610807ad351ed6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:45Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.108526 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d81323acabedcf739459ccc35af7dcbf68cd1159c9304c0ef841d774287425cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:45Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.116293 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:45Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.123393 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://222e3c51a925bbfd932c3620f13ad919006e2dad2c9063c60559ece4dd239f3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:45Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.131343 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gjjcm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e34542c-29e8-43d3-add3-42aa080e0c8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b138ef0764baeeae025ff996cd643edabbc1de8f567e73f41fa748f481929656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqc9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gjjcm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:45Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.142839 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c87d128c-39da-4880-9efc-7d2e6319a854\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dee71204e7a71c311143a6ed6fbde69968bc877053d6a400d252eaa010a88ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6cd4a3aeb4f2f1ad5ad6b22f80134a393ccb4c5b7e22c33d97324c8f70800d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a05bba24ff7f1d956fa0d3cb5949de61a278cddb4674bdb723749a2b005d82d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec3d3a350dce6c9b5ea637d00abdaf43796a5595105dc82133f3d84349c12c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://932166905ddf0fc44b9b7356a8f5efe68e1d44307d8e486946c8285ae969b8e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64685
d2778ef0cf9d316897042a9e9353603a0cc7a234baf821ec03b149aeb07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5901d1e7ebd3d979587ab0b065c05f639de1f07b2139edac2d4cf80c289503ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5901d1e7ebd3d979587ab0b065c05f639de1f07b2139edac2d4cf80c289503ff\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"message\\\":\\\"1.918808 5946 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-p4bl4 after 0 failed attempt(s)\\\\nI1209 14:58:41.919392 5946 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-p4bl4\\\\nI1209 14:58:41.918113 5946 services_controller.go:434] Service openshift-cluster-version/cluster-version-operator retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{cluster-version-operator openshift-cluster-version ddf4933a-f532-4906-9b8f-3b15aa433264 6187 0 2025-02-23 05:11:57 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:cluster-version-operator] map[exclude.release.openshift.io/internal-openshift-hosted:true include.release.openshift.io/self-managed-high-availability:true kubernetes.io/description:Expose cluster-version operator metrics to other in-cluster consumers. Access requires a prometheus-k8s RoleBinding in this namespace. 
service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:cluster-version-operator-serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc0006c6a07 \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Nam\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-p4bl4_openshift-ovn-kubernetes(c87d128c-39da-4880-9efc-7d2e6319a854)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9dc3533604ee31d652a75a8c65146c35b320c41f2d4fe85c3c03cadb99a9d46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"o
vnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ce6bcdccee7782ed6a2472052064ea5f0f9a5f25a8645c692165a2a4bf16dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9ce6bcdccee7782ed6a2472052064ea5f0f9a5f25a8645c692165a2a4bf16dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p4bl4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:45Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.160991 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.161021 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:58:45 crc kubenswrapper[4555]: E1209 14:58:45.161063 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 14:58:45 crc kubenswrapper[4555]: E1209 14:58:45.161109 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.161027 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:58:45 crc kubenswrapper[4555]: E1209 14:58:45.161197 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.194265 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwhfz\" (UniqueName: \"kubernetes.io/projected/ae951ddc-0e90-497c-af85-8e39088f32d9-kube-api-access-zwhfz\") pod \"ovnkube-control-plane-749d76644c-qchk6\" (UID: \"ae951ddc-0e90-497c-af85-8e39088f32d9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qchk6" Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.194325 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ae951ddc-0e90-497c-af85-8e39088f32d9-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-qchk6\" (UID: \"ae951ddc-0e90-497c-af85-8e39088f32d9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qchk6" Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.194358 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ae951ddc-0e90-497c-af85-8e39088f32d9-env-overrides\") pod \"ovnkube-control-plane-749d76644c-qchk6\" (UID: \"ae951ddc-0e90-497c-af85-8e39088f32d9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qchk6" Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.194404 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ae951ddc-0e90-497c-af85-8e39088f32d9-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-qchk6\" (UID: \"ae951ddc-0e90-497c-af85-8e39088f32d9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qchk6" Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.194886 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ae951ddc-0e90-497c-af85-8e39088f32d9-env-overrides\") pod \"ovnkube-control-plane-749d76644c-qchk6\" (UID: \"ae951ddc-0e90-497c-af85-8e39088f32d9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qchk6" Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.194939 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ae951ddc-0e90-497c-af85-8e39088f32d9-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-qchk6\" (UID: \"ae951ddc-0e90-497c-af85-8e39088f32d9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qchk6" Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.198539 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" 
(UniqueName: \"kubernetes.io/secret/ae951ddc-0e90-497c-af85-8e39088f32d9-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-qchk6\" (UID: \"ae951ddc-0e90-497c-af85-8e39088f32d9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qchk6" Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.199215 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.199297 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.199388 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.199446 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.199494 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:45Z","lastTransitionTime":"2025-12-09T14:58:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.206566 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwhfz\" (UniqueName: \"kubernetes.io/projected/ae951ddc-0e90-497c-af85-8e39088f32d9-kube-api-access-zwhfz\") pod \"ovnkube-control-plane-749d76644c-qchk6\" (UID: \"ae951ddc-0e90-497c-af85-8e39088f32d9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qchk6" Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.300714 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.300742 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.300751 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.300762 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.300770 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:45Z","lastTransitionTime":"2025-12-09T14:58:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.309236 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qchk6" Dec 09 14:58:45 crc kubenswrapper[4555]: W1209 14:58:45.318274 4555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae951ddc_0e90_497c_af85_8e39088f32d9.slice/crio-e2673b0482944d28b8404307b2b349fb8f70b961b0afa24924cc282b49a87794 WatchSource:0}: Error finding container e2673b0482944d28b8404307b2b349fb8f70b961b0afa24924cc282b49a87794: Status 404 returned error can't find the container with id e2673b0482944d28b8404307b2b349fb8f70b961b0afa24924cc282b49a87794 Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.403030 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.403068 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.403078 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.403092 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.403101 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:45Z","lastTransitionTime":"2025-12-09T14:58:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.505036 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.505068 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.505076 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.505089 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.505098 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:45Z","lastTransitionTime":"2025-12-09T14:58:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.607033 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.607066 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.607074 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.607086 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.607094 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:45Z","lastTransitionTime":"2025-12-09T14:58:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.708606 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.708641 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.708650 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.708664 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.708674 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:45Z","lastTransitionTime":"2025-12-09T14:58:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.810635 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.810682 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.810692 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.810707 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.810716 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:45Z","lastTransitionTime":"2025-12-09T14:58:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.912942 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.913008 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.913018 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.913040 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:45 crc kubenswrapper[4555]: I1209 14:58:45.913056 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:45Z","lastTransitionTime":"2025-12-09T14:58:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.014962 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.015001 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.015010 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.015025 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.015035 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:46Z","lastTransitionTime":"2025-12-09T14:58:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.117060 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.117091 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.117101 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.117115 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.117122 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:46Z","lastTransitionTime":"2025-12-09T14:58:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.218730 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.218786 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.218797 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.218817 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.218837 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:46Z","lastTransitionTime":"2025-12-09T14:58:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.314011 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qchk6" event={"ID":"ae951ddc-0e90-497c-af85-8e39088f32d9","Type":"ContainerStarted","Data":"e1a5e10f192fc24405b980ce611a09f1e0c6d49c4c94df89bceef151a747f1aa"} Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.314057 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qchk6" event={"ID":"ae951ddc-0e90-497c-af85-8e39088f32d9","Type":"ContainerStarted","Data":"59535f0584581ae9d42fa3c1d9f4f204c8a2c7c96d211a2b8d74806eb3901a33"} Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.314069 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qchk6" event={"ID":"ae951ddc-0e90-497c-af85-8e39088f32d9","Type":"ContainerStarted","Data":"e2673b0482944d28b8404307b2b349fb8f70b961b0afa24924cc282b49a87794"} Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.320363 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.320398 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.320409 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.320421 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.320430 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:46Z","lastTransitionTime":"2025-12-09T14:58:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.324139 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qchk6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae951ddc-0e90-497c-af85-8e39088f32d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59535f0584581ae9d42fa3c1d9f4f204c8a2c7c96d211a2b8d74806eb3901a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwhfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1a5e10f192fc24405b980ce611a09f1e0c6d49c4c94df89bceef151a747f1aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwhfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qchk6\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:46Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.332503 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:46Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.345392 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd5a66337668da0b71f8033ec52d3d1ed9ac69f468d6e48c1e863204ec6a6349\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2fd50693723f3d30a7f5fb890ed870f55b357017b85cdf7206ffc97d0d3d9b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:46Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.354956 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b929ab12-7d60-4e9f-a1a7-3e2d8b086d36\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04e20c1851a9313f071ddc6ae1191e467f3bea284ea6e0defd914dc5c926b7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff6nf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b220eef2e4180fd65d4a5381999b91753bb7322a9ab7101edf783cb800715fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff6nf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x6p6q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:46Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.364511 4555 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tpsq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd8f66c1-da34-466f-ba53-4848c96b1287\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ae3cc97b46fa36f9bd6f439124d81700af58cb44671d46ffa2936260f6dda4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2028082f4c35a431ec183abac3ace39ae9d9d292a8b3b879cf35cf03dc606d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2028082f4c35a431ec183abac3ace39ae9d9d292a8b3b879cf35cf03dc606d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f6498db60b815cc152db11a9d809f415e6ed632f66cf7eaeecbacb230d59dbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2c
c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f6498db60b815cc152db11a9d809f415e6ed632f66cf7eaeecbacb230d59dbe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd308a4726360ffaa40fd9895d4684062d998a9dec9c76222a4be5140ece0ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd308a4726360ffaa40fd9895d4684062d998a9dec9c76222a4be5140ece0ced\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e047250a4bced8e37ee26d9ba3065a61294fa9cc6d9491b6b96e35758b431edc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e047250a4bced8e37ee26d9ba3065a61294fa9cc6d9491b6b96e35758b431edc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d24862fb9296afeb3ec86c0cd5e862021c7e4c8a0f139f9fd11dfd1faba2b392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d24862fb9296afeb3ec86c0cd5e862021c7e4c8a0f139f9fd11dfd1faba2b392\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ac07e538318096b1c9c2c8d0608f0b4d9b4367df0bb24575591b2f3f6c28c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ac07e538318096b1c9c2c8d0608f0b4d9b4367df0bb24575591b2f3f6c28c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tpsq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:46Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.372154 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t6kl9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c97d4dc5-ce47-4a7b-9511-7168db0ee4b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd97a458fe790bf9d7a8587a77d464c8bc4f97919bf4646660d92bd016be2707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47xhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t6kl9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:46Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.380565 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"031bef68-de1b-47ae-92d9-8f3ab57baecb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd53bcc6affff59de2add405be1496710b4488d3794d358533ddaeecb1c50c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cd3af11c101edd63da0124a2190c0ee60408cf12e241de0b5743aff5732738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://787954aa7f87026339e7b167ba10c60c81f91c4d0f234fc508684f929edda0ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c591bf964c364ed5dfb0b30ae43f7fd456c549f701b0f1b8f7ab9ca7b8648f1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7803e19de1c3c096dfb29cd40b66a640cf8fe695b24eaf0ecfef38e88feefc27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:46Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.388243 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-6sc2q"] Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.388226 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23358419-437c-41bf-bf1f-0da572d7ae77\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5ec008c2fa2ff6b0bffe7a32708ff1d0bfcfc0b6edbdd084977b99a2051bdc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://376e853c04e3ccafef162872380c7bf1dc0301c04c70c5428b965ca18989088f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6099f0072464b66ae672e70c295fe21a13657b81cc71dcecc55314cd91816c75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b8f1c6dd89c91a8ff2ca8baf361ee78811eeb8e10f2c4bb0504644cdd1302d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:46Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.388753 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sc2q" Dec 09 14:58:46 crc kubenswrapper[4555]: E1209 14:58:46.388818 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6sc2q" podUID="35f7e7bd-b667-42fa-b040-4a186e4570e3" Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.397596 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:46Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.406085 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gjjcm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e34542c-29e8-43d3-add3-42aa080e0c8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b138ef0764baeeae025ff996cd643edabbc1de8f567e73f41fa748f481929656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqc9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gjjcm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:46Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.420680 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c87d128c-39da-4880-9efc-7d2e6319a854\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dee71204e7a71c311143a6ed6fbde69968bc877053d6a400d252eaa010a88ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6cd4a3aeb4f2f1ad5ad6b22f80134a393ccb4c5b7e22c33d97324c8f70800d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a05bba24ff7f1d956fa0d3cb5949de61a278cddb4674bdb723749a2b005d82d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec3d3a350dce6c9b5ea637d00abdaf43796a5595105dc82133f3d84349c12c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://932166905ddf0fc44b9b7356a8f5efe68e1d44307d8e486946c8285ae969b8e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64685d2778ef0cf9d316897042a9e9353603a0cc7a234baf821ec03b149aeb07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5901d1e7ebd3d979587ab0b065c05f639de1f07b
2139edac2d4cf80c289503ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5901d1e7ebd3d979587ab0b065c05f639de1f07b2139edac2d4cf80c289503ff\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"message\\\":\\\"1.918808 5946 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-p4bl4 after 0 failed attempt(s)\\\\nI1209 14:58:41.919392 5946 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-p4bl4\\\\nI1209 14:58:41.918113 5946 services_controller.go:434] Service openshift-cluster-version/cluster-version-operator retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{cluster-version-operator openshift-cluster-version ddf4933a-f532-4906-9b8f-3b15aa433264 6187 0 2025-02-23 05:11:57 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:cluster-version-operator] map[exclude.release.openshift.io/internal-openshift-hosted:true include.release.openshift.io/self-managed-high-availability:true kubernetes.io/description:Expose cluster-version operator metrics to other in-cluster consumers. Access requires a prometheus-k8s RoleBinding in this namespace. service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:cluster-version-operator-serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc0006c6a07 \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Nam\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-p4bl4_openshift-ovn-kubernetes(c87d128c-39da-4880-9efc-7d2e6319a854)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9dc3533604ee31d652a75a8c65146c35b320c41f2d4fe85c3c03cadb99a9d46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ce6bcdccee7782ed6a2472052064ea5f0f9a5f25a8645c692165a2a4bf16dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9ce6bcdccee7782ed6a2472052064ea5f0f9a5f25a8645c692165a2a4bf16dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p4bl4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:46Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.422033 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.422060 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.422071 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.422083 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.422092 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:46Z","lastTransitionTime":"2025-12-09T14:58:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.428525 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-72msc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4f13f82-a44f-461d-93b0-916a7ed0b925\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06c787dc46c305e7388f67f85601eb3e724c501d4da0504d4f8fd569f533c634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2spqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-72msc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:46Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.442132 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cb89a32-a132-4a5c-88ca-e561d2462abb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8452332d27457e371020c42533ef78285e33851568eed69be71bc9d10e663b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32422f3e8abddcc0d533cce0b4fc55c65309e00a02d4757d415d6f63f9b2e47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9db356805ba4080379f39ddf432276ac77417504341e77c1cc2d704756af1f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550d7f8a2e6cf94de70026192f2660d59f31fa1
675a7c7e432767db5bf58c1b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://630ad7ccda8e51ae4d116b5821e71ebcc3fb68b4614de58470610807ad351ed6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:46Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.450567 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d81323acabedcf739459ccc35af7dcbf68cd1159c9304c0ef841d774287425cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:46Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.458207 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:46Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.465315 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://222e3c51a925bbfd932c3620f13ad919006e2dad2c9063c60559ece4dd239f3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:46Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.473472 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"031bef68-de1b-47ae-92d9-8f3ab57baecb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd53bcc6affff59de2add405be1496710b4488d3794d358533ddaeecb1c50c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cd3af11c101edd63da0124a2190c0ee60408cf12e241de0b5743aff5732738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://787954aa7f87026339e7b167ba10c60c81f91c4d0f234fc508684f929edda0ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c591bf964c364ed5dfb0b30ae43f7fd456c549f701b0f1b8f7ab9ca7b8648f1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7803e19de1c3c096dfb29cd40b66a640cf8fe695b24eaf0ecfef38e88feefc27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:46Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.481085 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23358419-437c-41bf-bf1f-0da572d7ae77\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5ec008c2fa2ff6b0bffe7a32708ff1d0bfcfc0b6edbdd084977b99a2051bdc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://376e853c04e3ccafef162872380c7bf1dc0301c04c70c5428b965ca18989088f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6099f0072464b66ae672e70c295fe21a13657b81cc71dcecc55314cd91816c75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b8f1c6dd89c91a8ff2ca8baf361ee78811eeb8e10f2c4bb0504644cdd1302d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:46Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.488407 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:46Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.494813 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6sc2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35f7e7bd-b667-42fa-b040-4a186e4570e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mc8nv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mc8nv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:46Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6sc2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:46Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.501767 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://222e3c51a925bbfd932c3620f13ad919006e2dad2c9063c60559ece4dd239f3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:46Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.503886 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mc8nv\" (UniqueName: \"kubernetes.io/projected/35f7e7bd-b667-42fa-b040-4a186e4570e3-kube-api-access-mc8nv\") pod \"network-metrics-daemon-6sc2q\" (UID: \"35f7e7bd-b667-42fa-b040-4a186e4570e3\") " pod="openshift-multus/network-metrics-daemon-6sc2q" Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.503920 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/35f7e7bd-b667-42fa-b040-4a186e4570e3-metrics-certs\") pod \"network-metrics-daemon-6sc2q\" (UID: \"35f7e7bd-b667-42fa-b040-4a186e4570e3\") " pod="openshift-multus/network-metrics-daemon-6sc2q" Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.510027 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gjjcm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e34542c-29e8-43d3-add3-42aa080e0c8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b138ef0764baeeae025ff996cd643edabbc1de8f567e73f41fa748f481929656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqc9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gjjcm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:46Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.524103 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.524207 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.524222 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.524249 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.524270 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:46Z","lastTransitionTime":"2025-12-09T14:58:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.528396 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c87d128c-39da-4880-9efc-7d2e6319a854\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dee71204e7a71c311143a6ed6fbde69968bc877053d6a400d252eaa010a88ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6cd4a3aeb4f2f1ad5ad6b22f80134a393ccb4c5b7e22c33d97324c8f70800d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a05bba24ff7f1d956fa0d3cb5949de61a278cddb4674bdb723749a2b005d82d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec3d3a350dce6c9b5ea637d00abdaf43796a5595105dc82133f3d84349c12c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://932166905ddf0fc44b9b7356a8f5efe68e1d44307d8e486946c8285ae969b8e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64685d2778ef0cf9d316897042a9e9353603a0cc7a234baf821ec03b149aeb07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5901d1e7ebd3d979587ab0b065c05f639de1f07b
2139edac2d4cf80c289503ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5901d1e7ebd3d979587ab0b065c05f639de1f07b2139edac2d4cf80c289503ff\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"message\\\":\\\"1.918808 5946 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-p4bl4 after 0 failed attempt(s)\\\\nI1209 14:58:41.919392 5946 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-p4bl4\\\\nI1209 14:58:41.918113 5946 services_controller.go:434] Service openshift-cluster-version/cluster-version-operator retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{cluster-version-operator openshift-cluster-version ddf4933a-f532-4906-9b8f-3b15aa433264 6187 0 2025-02-23 05:11:57 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:cluster-version-operator] map[exclude.release.openshift.io/internal-openshift-hosted:true include.release.openshift.io/self-managed-high-availability:true kubernetes.io/description:Expose cluster-version operator metrics to other in-cluster consumers. Access requires a prometheus-k8s RoleBinding in this namespace. service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:cluster-version-operator-serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc0006c6a07 \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Nam\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-p4bl4_openshift-ovn-kubernetes(c87d128c-39da-4880-9efc-7d2e6319a854)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9dc3533604ee31d652a75a8c65146c35b320c41f2d4fe85c3c03cadb99a9d46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ce6bcdccee7782ed6a2472052064ea5f0f9a5f25a8645c692165a2a4bf16dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9ce6bcdccee7782ed6a2472052064ea5f0f9a5f25a8645c692165a2a4bf16dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p4bl4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:46Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.538037 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-72msc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4f13f82-a44f-461d-93b0-916a7ed0b925\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06c787dc46c305e7388f67f85601eb3e724c501d4da0504d4f8fd569f533c634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2spqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-72msc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:46Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.552954 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cb89a32-a132-4a5c-88ca-e561d2462abb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8452332d27457e371020c42533ef78285e33851568eed69be71bc9d10e663b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32422f3e8abddcc0d533cce0b4fc55c65309e00a02d4757d415d6f63f9b2e47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"co
ntainerID\\\":\\\"cri-o://e9db356805ba4080379f39ddf432276ac77417504341e77c1cc2d704756af1f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550d7f8a2e6cf94de70026192f2660d59f31fa1675a7c7e432767db5bf58c1b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://630ad7ccda8e51ae4d116b5821e71ebcc3fb68b4614de58470610807ad351ed6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f
71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:46Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.563468 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d81323acabedcf739459ccc35af7dcbf68cd1159c9304c0ef841d774287425cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:46Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.572576 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:46Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.582399 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tpsq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd8f66c1-da34-466f-ba53-4848c96b1287\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ae3cc97b46fa36f9bd6f439124d81700af58cb44671d46ffa2936260f6dda4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2028082f4c35a431ec183abac3ace39ae9d9d292a8b3b879cf35cf03dc606d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2028082f4c35a431ec183abac3ace39ae9d9d292a8b3b879cf35cf03dc606d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f6498db60b815cc152db11a9d809f415e6ed632f66cf7eaeecbacb230d59dbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f6498db60b815cc152db11a9d809f415e6ed632f66cf7eaeecbacb230d59dbe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd308a4726360ffaa40fd9895d4684062d998a9dec9c76222a4be5140ece0ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd308a4726360ffaa40fd9895d4684062d998a9dec9c76222a4be5140ece0ced\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]},{\\\"containerID\\\":\\\"cri-o://e047250a4bced8e37ee26d9ba3065a61294fa9cc6d9491b6b96e35758b431edc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e047250a4bced8e37ee26d9ba3065a61294fa9cc6d9491b6b96e35758b431edc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d24862fb9296afeb3ec86c0cd5e862021c7e4c8a0f139f9fd11dfd1faba2b392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d24862fb9296afeb3ec86c0cd5e862021c7e4c8a0f139f9fd11dfd1faba2b392\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ac07e538318096b1c9c2c8d0608f0b4d9b4367df0bb24575591b2f3f6c28c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ac07e538318096b1c9c2c8d0608f0b4d9b4367df0bb24575591b2f3f6c28c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name
\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tpsq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:46Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.591002 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qchk6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae951ddc-0e90-497c-af85-8e39088f32d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59535f0584581ae9d42fa3c1d9f4f204c8a2c7c96d211a2b8d74806eb3901a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwhfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1a5e10f192fc24405b980ce611a09f1e0c6d49c4c94df89bceef151a747f1aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwhfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qchk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:46Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.601283 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:46Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.604346 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mc8nv\" (UniqueName: \"kubernetes.io/projected/35f7e7bd-b667-42fa-b040-4a186e4570e3-kube-api-access-mc8nv\") pod \"network-metrics-daemon-6sc2q\" (UID: \"35f7e7bd-b667-42fa-b040-4a186e4570e3\") " pod="openshift-multus/network-metrics-daemon-6sc2q" Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.604380 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/35f7e7bd-b667-42fa-b040-4a186e4570e3-metrics-certs\") pod \"network-metrics-daemon-6sc2q\" (UID: \"35f7e7bd-b667-42fa-b040-4a186e4570e3\") " pod="openshift-multus/network-metrics-daemon-6sc2q" Dec 09 14:58:46 crc kubenswrapper[4555]: E1209 14:58:46.604480 4555 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 14:58:46 crc kubenswrapper[4555]: E1209 14:58:46.604529 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/35f7e7bd-b667-42fa-b040-4a186e4570e3-metrics-certs podName:35f7e7bd-b667-42fa-b040-4a186e4570e3 nodeName:}" failed. No retries permitted until 2025-12-09 14:58:47.104517028 +0000 UTC m=+37.241229746 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/35f7e7bd-b667-42fa-b040-4a186e4570e3-metrics-certs") pod "network-metrics-daemon-6sc2q" (UID: "35f7e7bd-b667-42fa-b040-4a186e4570e3") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.609947 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd5a66337668da0b71f8033ec52d3d1ed9ac69f468d6e48c1e863204ec6a6349\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2fd50693723f3d30a7f5fb890ed870f55b357017b85cdf7206ffc97d0d3d9b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-09T14:58:46Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.618464 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b929ab12-7d60-4e9f-a1a7-3e2d8b086d36\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04e20c1851a9313f071ddc6ae1191e467f3bea284ea6e0defd914dc5c926b7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff6nf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b220eef2e4180fd65d4a5381999b91753bb7322a9ab7101edf783cb800715fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff6nf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x6p6q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:46Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.619772 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mc8nv\" (UniqueName: \"kubernetes.io/projected/35f7e7bd-b667-42fa-b040-4a186e4570e3-kube-api-access-mc8nv\") pod \"network-metrics-daemon-6sc2q\" (UID: \"35f7e7bd-b667-42fa-b040-4a186e4570e3\") " pod="openshift-multus/network-metrics-daemon-6sc2q" Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.626660 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t6kl9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c97d4dc5-ce47-4a7b-9511-7168db0ee4b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd97a458fe790bf9d7a8587a77d464c8bc4f97919bf4646660d92bd016be2707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47xhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t6kl9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:46Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.626822 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.626862 4555 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.626873 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.626888 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.626899 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:46Z","lastTransitionTime":"2025-12-09T14:58:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.728546 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.728576 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.728585 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.728599 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.728610 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:46Z","lastTransitionTime":"2025-12-09T14:58:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.830738 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.830776 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.830789 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.830805 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.830815 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:46Z","lastTransitionTime":"2025-12-09T14:58:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.932703 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.932737 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.932746 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.932758 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:46 crc kubenswrapper[4555]: I1209 14:58:46.932768 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:46Z","lastTransitionTime":"2025-12-09T14:58:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:47 crc kubenswrapper[4555]: I1209 14:58:47.034877 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:47 crc kubenswrapper[4555]: I1209 14:58:47.034911 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:47 crc kubenswrapper[4555]: I1209 14:58:47.034921 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:47 crc kubenswrapper[4555]: I1209 14:58:47.034934 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:47 crc kubenswrapper[4555]: I1209 14:58:47.034943 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:47Z","lastTransitionTime":"2025-12-09T14:58:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:47 crc kubenswrapper[4555]: I1209 14:58:47.109813 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/35f7e7bd-b667-42fa-b040-4a186e4570e3-metrics-certs\") pod \"network-metrics-daemon-6sc2q\" (UID: \"35f7e7bd-b667-42fa-b040-4a186e4570e3\") " pod="openshift-multus/network-metrics-daemon-6sc2q" Dec 09 14:58:47 crc kubenswrapper[4555]: E1209 14:58:47.109912 4555 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 14:58:47 crc kubenswrapper[4555]: E1209 14:58:47.109968 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/35f7e7bd-b667-42fa-b040-4a186e4570e3-metrics-certs podName:35f7e7bd-b667-42fa-b040-4a186e4570e3 nodeName:}" failed. No retries permitted until 2025-12-09 14:58:48.10995444 +0000 UTC m=+38.246667158 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/35f7e7bd-b667-42fa-b040-4a186e4570e3-metrics-certs") pod "network-metrics-daemon-6sc2q" (UID: "35f7e7bd-b667-42fa-b040-4a186e4570e3") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 14:58:47 crc kubenswrapper[4555]: I1209 14:58:47.136728 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:47 crc kubenswrapper[4555]: I1209 14:58:47.136778 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:47 crc kubenswrapper[4555]: I1209 14:58:47.136789 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:47 crc kubenswrapper[4555]: I1209 14:58:47.136804 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:47 crc kubenswrapper[4555]: I1209 14:58:47.136818 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:47Z","lastTransitionTime":"2025-12-09T14:58:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:47 crc kubenswrapper[4555]: I1209 14:58:47.161100 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:58:47 crc kubenswrapper[4555]: I1209 14:58:47.161123 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:58:47 crc kubenswrapper[4555]: I1209 14:58:47.161134 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:58:47 crc kubenswrapper[4555]: E1209 14:58:47.161224 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 14:58:47 crc kubenswrapper[4555]: E1209 14:58:47.161292 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 14:58:47 crc kubenswrapper[4555]: E1209 14:58:47.161401 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 14:58:47 crc kubenswrapper[4555]: I1209 14:58:47.238580 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:47 crc kubenswrapper[4555]: I1209 14:58:47.238608 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:47 crc kubenswrapper[4555]: I1209 14:58:47.238617 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:47 crc kubenswrapper[4555]: I1209 14:58:47.238629 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:47 crc kubenswrapper[4555]: I1209 14:58:47.238638 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:47Z","lastTransitionTime":"2025-12-09T14:58:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:47 crc kubenswrapper[4555]: I1209 14:58:47.341530 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:47 crc kubenswrapper[4555]: I1209 14:58:47.341566 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:47 crc kubenswrapper[4555]: I1209 14:58:47.341577 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:47 crc kubenswrapper[4555]: I1209 14:58:47.341591 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:47 crc kubenswrapper[4555]: I1209 14:58:47.341599 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:47Z","lastTransitionTime":"2025-12-09T14:58:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:47 crc kubenswrapper[4555]: I1209 14:58:47.443511 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:47 crc kubenswrapper[4555]: I1209 14:58:47.443545 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:47 crc kubenswrapper[4555]: I1209 14:58:47.443555 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:47 crc kubenswrapper[4555]: I1209 14:58:47.443568 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:47 crc kubenswrapper[4555]: I1209 14:58:47.443581 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:47Z","lastTransitionTime":"2025-12-09T14:58:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:47 crc kubenswrapper[4555]: I1209 14:58:47.545779 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:47 crc kubenswrapper[4555]: I1209 14:58:47.545806 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:47 crc kubenswrapper[4555]: I1209 14:58:47.545815 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:47 crc kubenswrapper[4555]: I1209 14:58:47.545826 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:47 crc kubenswrapper[4555]: I1209 14:58:47.545835 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:47Z","lastTransitionTime":"2025-12-09T14:58:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:47 crc kubenswrapper[4555]: I1209 14:58:47.647926 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:47 crc kubenswrapper[4555]: I1209 14:58:47.647956 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:47 crc kubenswrapper[4555]: I1209 14:58:47.647964 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:47 crc kubenswrapper[4555]: I1209 14:58:47.647974 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:47 crc kubenswrapper[4555]: I1209 14:58:47.647982 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:47Z","lastTransitionTime":"2025-12-09T14:58:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:47 crc kubenswrapper[4555]: I1209 14:58:47.750105 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:47 crc kubenswrapper[4555]: I1209 14:58:47.750161 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:47 crc kubenswrapper[4555]: I1209 14:58:47.750172 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:47 crc kubenswrapper[4555]: I1209 14:58:47.750196 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:47 crc kubenswrapper[4555]: I1209 14:58:47.750209 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:47Z","lastTransitionTime":"2025-12-09T14:58:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:47 crc kubenswrapper[4555]: I1209 14:58:47.853502 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:47 crc kubenswrapper[4555]: I1209 14:58:47.853539 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:47 crc kubenswrapper[4555]: I1209 14:58:47.853550 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:47 crc kubenswrapper[4555]: I1209 14:58:47.853563 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:47 crc kubenswrapper[4555]: I1209 14:58:47.853572 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:47Z","lastTransitionTime":"2025-12-09T14:58:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:47 crc kubenswrapper[4555]: I1209 14:58:47.955370 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:47 crc kubenswrapper[4555]: I1209 14:58:47.955397 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:47 crc kubenswrapper[4555]: I1209 14:58:47.955409 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:47 crc kubenswrapper[4555]: I1209 14:58:47.955421 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:47 crc kubenswrapper[4555]: I1209 14:58:47.955430 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:47Z","lastTransitionTime":"2025-12-09T14:58:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:48 crc kubenswrapper[4555]: I1209 14:58:48.056983 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:48 crc kubenswrapper[4555]: I1209 14:58:48.057008 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:48 crc kubenswrapper[4555]: I1209 14:58:48.057016 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:48 crc kubenswrapper[4555]: I1209 14:58:48.057027 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:48 crc kubenswrapper[4555]: I1209 14:58:48.057034 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:48Z","lastTransitionTime":"2025-12-09T14:58:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:48 crc kubenswrapper[4555]: I1209 14:58:48.118081 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/35f7e7bd-b667-42fa-b040-4a186e4570e3-metrics-certs\") pod \"network-metrics-daemon-6sc2q\" (UID: \"35f7e7bd-b667-42fa-b040-4a186e4570e3\") " pod="openshift-multus/network-metrics-daemon-6sc2q" Dec 09 14:58:48 crc kubenswrapper[4555]: E1209 14:58:48.118202 4555 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 14:58:48 crc kubenswrapper[4555]: E1209 14:58:48.118248 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/35f7e7bd-b667-42fa-b040-4a186e4570e3-metrics-certs podName:35f7e7bd-b667-42fa-b040-4a186e4570e3 nodeName:}" failed. No retries permitted until 2025-12-09 14:58:50.118234533 +0000 UTC m=+40.254947250 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/35f7e7bd-b667-42fa-b040-4a186e4570e3-metrics-certs") pod "network-metrics-daemon-6sc2q" (UID: "35f7e7bd-b667-42fa-b040-4a186e4570e3") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 14:58:48 crc kubenswrapper[4555]: I1209 14:58:48.158330 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:48 crc kubenswrapper[4555]: I1209 14:58:48.158351 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:48 crc kubenswrapper[4555]: I1209 14:58:48.158361 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:48 crc kubenswrapper[4555]: I1209 14:58:48.158370 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:48 crc kubenswrapper[4555]: I1209 14:58:48.158382 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:48Z","lastTransitionTime":"2025-12-09T14:58:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:48 crc kubenswrapper[4555]: I1209 14:58:48.160765 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sc2q" Dec 09 14:58:48 crc kubenswrapper[4555]: E1209 14:58:48.160871 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6sc2q" podUID="35f7e7bd-b667-42fa-b040-4a186e4570e3" Dec 09 14:58:48 crc kubenswrapper[4555]: I1209 14:58:48.259710 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:48 crc kubenswrapper[4555]: I1209 14:58:48.259741 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:48 crc kubenswrapper[4555]: I1209 14:58:48.259751 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:48 crc kubenswrapper[4555]: I1209 14:58:48.259763 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:48 crc kubenswrapper[4555]: I1209 14:58:48.259772 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:48Z","lastTransitionTime":"2025-12-09T14:58:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:48 crc kubenswrapper[4555]: I1209 14:58:48.361738 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:48 crc kubenswrapper[4555]: I1209 14:58:48.361775 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:48 crc kubenswrapper[4555]: I1209 14:58:48.361786 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:48 crc kubenswrapper[4555]: I1209 14:58:48.361801 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:48 crc kubenswrapper[4555]: I1209 14:58:48.361812 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:48Z","lastTransitionTime":"2025-12-09T14:58:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:48 crc kubenswrapper[4555]: I1209 14:58:48.463798 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:48 crc kubenswrapper[4555]: I1209 14:58:48.463831 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:48 crc kubenswrapper[4555]: I1209 14:58:48.463840 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:48 crc kubenswrapper[4555]: I1209 14:58:48.463851 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:48 crc kubenswrapper[4555]: I1209 14:58:48.463859 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:48Z","lastTransitionTime":"2025-12-09T14:58:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:48 crc kubenswrapper[4555]: I1209 14:58:48.566226 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:48 crc kubenswrapper[4555]: I1209 14:58:48.566255 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:48 crc kubenswrapper[4555]: I1209 14:58:48.566274 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:48 crc kubenswrapper[4555]: I1209 14:58:48.566287 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:48 crc kubenswrapper[4555]: I1209 14:58:48.566295 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:48Z","lastTransitionTime":"2025-12-09T14:58:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:48 crc kubenswrapper[4555]: I1209 14:58:48.668018 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:48 crc kubenswrapper[4555]: I1209 14:58:48.668043 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:48 crc kubenswrapper[4555]: I1209 14:58:48.668051 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:48 crc kubenswrapper[4555]: I1209 14:58:48.668062 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:48 crc kubenswrapper[4555]: I1209 14:58:48.668086 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:48Z","lastTransitionTime":"2025-12-09T14:58:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:48 crc kubenswrapper[4555]: I1209 14:58:48.769729 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:48 crc kubenswrapper[4555]: I1209 14:58:48.769755 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:48 crc kubenswrapper[4555]: I1209 14:58:48.769763 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:48 crc kubenswrapper[4555]: I1209 14:58:48.769772 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:48 crc kubenswrapper[4555]: I1209 14:58:48.769780 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:48Z","lastTransitionTime":"2025-12-09T14:58:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:48 crc kubenswrapper[4555]: I1209 14:58:48.871372 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:48 crc kubenswrapper[4555]: I1209 14:58:48.871413 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:48 crc kubenswrapper[4555]: I1209 14:58:48.871422 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:48 crc kubenswrapper[4555]: I1209 14:58:48.871431 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:48 crc kubenswrapper[4555]: I1209 14:58:48.871437 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:48Z","lastTransitionTime":"2025-12-09T14:58:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:48 crc kubenswrapper[4555]: I1209 14:58:48.972885 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:48 crc kubenswrapper[4555]: I1209 14:58:48.972913 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:48 crc kubenswrapper[4555]: I1209 14:58:48.972922 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:48 crc kubenswrapper[4555]: I1209 14:58:48.972932 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:48 crc kubenswrapper[4555]: I1209 14:58:48.972940 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:48Z","lastTransitionTime":"2025-12-09T14:58:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:49 crc kubenswrapper[4555]: I1209 14:58:49.074280 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:49 crc kubenswrapper[4555]: I1209 14:58:49.074397 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:49 crc kubenswrapper[4555]: I1209 14:58:49.074459 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:49 crc kubenswrapper[4555]: I1209 14:58:49.074544 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:49 crc kubenswrapper[4555]: I1209 14:58:49.074604 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:49Z","lastTransitionTime":"2025-12-09T14:58:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:49 crc kubenswrapper[4555]: I1209 14:58:49.160546 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:58:49 crc kubenswrapper[4555]: E1209 14:58:49.160715 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 14:58:49 crc kubenswrapper[4555]: I1209 14:58:49.160570 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:58:49 crc kubenswrapper[4555]: E1209 14:58:49.160901 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 14:58:49 crc kubenswrapper[4555]: I1209 14:58:49.160546 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:58:49 crc kubenswrapper[4555]: E1209 14:58:49.161058 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 14:58:49 crc kubenswrapper[4555]: I1209 14:58:49.176127 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:49 crc kubenswrapper[4555]: I1209 14:58:49.176151 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:49 crc kubenswrapper[4555]: I1209 14:58:49.176160 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:49 crc kubenswrapper[4555]: I1209 14:58:49.176170 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:49 crc kubenswrapper[4555]: I1209 14:58:49.176180 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:49Z","lastTransitionTime":"2025-12-09T14:58:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:49 crc kubenswrapper[4555]: I1209 14:58:49.278021 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:49 crc kubenswrapper[4555]: I1209 14:58:49.278045 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:49 crc kubenswrapper[4555]: I1209 14:58:49.278053 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:49 crc kubenswrapper[4555]: I1209 14:58:49.278063 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:49 crc kubenswrapper[4555]: I1209 14:58:49.278070 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:49Z","lastTransitionTime":"2025-12-09T14:58:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:49 crc kubenswrapper[4555]: I1209 14:58:49.379524 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:49 crc kubenswrapper[4555]: I1209 14:58:49.379560 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:49 crc kubenswrapper[4555]: I1209 14:58:49.379569 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:49 crc kubenswrapper[4555]: I1209 14:58:49.379582 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:49 crc kubenswrapper[4555]: I1209 14:58:49.379590 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:49Z","lastTransitionTime":"2025-12-09T14:58:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:49 crc kubenswrapper[4555]: I1209 14:58:49.481392 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:49 crc kubenswrapper[4555]: I1209 14:58:49.481425 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:49 crc kubenswrapper[4555]: I1209 14:58:49.481432 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:49 crc kubenswrapper[4555]: I1209 14:58:49.481445 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:49 crc kubenswrapper[4555]: I1209 14:58:49.481454 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:49Z","lastTransitionTime":"2025-12-09T14:58:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:49 crc kubenswrapper[4555]: I1209 14:58:49.583781 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:49 crc kubenswrapper[4555]: I1209 14:58:49.583807 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:49 crc kubenswrapper[4555]: I1209 14:58:49.583816 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:49 crc kubenswrapper[4555]: I1209 14:58:49.583827 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:49 crc kubenswrapper[4555]: I1209 14:58:49.583834 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:49Z","lastTransitionTime":"2025-12-09T14:58:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:49 crc kubenswrapper[4555]: I1209 14:58:49.687047 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:49 crc kubenswrapper[4555]: I1209 14:58:49.687105 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:49 crc kubenswrapper[4555]: I1209 14:58:49.687115 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:49 crc kubenswrapper[4555]: I1209 14:58:49.687130 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:49 crc kubenswrapper[4555]: I1209 14:58:49.687143 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:49Z","lastTransitionTime":"2025-12-09T14:58:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:49 crc kubenswrapper[4555]: I1209 14:58:49.790038 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:49 crc kubenswrapper[4555]: I1209 14:58:49.790080 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:49 crc kubenswrapper[4555]: I1209 14:58:49.790089 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:49 crc kubenswrapper[4555]: I1209 14:58:49.790102 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:49 crc kubenswrapper[4555]: I1209 14:58:49.790109 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:49Z","lastTransitionTime":"2025-12-09T14:58:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:49 crc kubenswrapper[4555]: I1209 14:58:49.891825 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:49 crc kubenswrapper[4555]: I1209 14:58:49.891857 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:49 crc kubenswrapper[4555]: I1209 14:58:49.891867 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:49 crc kubenswrapper[4555]: I1209 14:58:49.891878 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:49 crc kubenswrapper[4555]: I1209 14:58:49.891886 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:49Z","lastTransitionTime":"2025-12-09T14:58:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:49 crc kubenswrapper[4555]: I1209 14:58:49.994115 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:49 crc kubenswrapper[4555]: I1209 14:58:49.994133 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:49 crc kubenswrapper[4555]: I1209 14:58:49.994141 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:49 crc kubenswrapper[4555]: I1209 14:58:49.994150 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:49 crc kubenswrapper[4555]: I1209 14:58:49.994157 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:49Z","lastTransitionTime":"2025-12-09T14:58:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.096291 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.096336 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.096346 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.096357 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.096366 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:50Z","lastTransitionTime":"2025-12-09T14:58:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.134392 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/35f7e7bd-b667-42fa-b040-4a186e4570e3-metrics-certs\") pod \"network-metrics-daemon-6sc2q\" (UID: \"35f7e7bd-b667-42fa-b040-4a186e4570e3\") " pod="openshift-multus/network-metrics-daemon-6sc2q" Dec 09 14:58:50 crc kubenswrapper[4555]: E1209 14:58:50.134578 4555 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 14:58:50 crc kubenswrapper[4555]: E1209 14:58:50.134663 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/35f7e7bd-b667-42fa-b040-4a186e4570e3-metrics-certs podName:35f7e7bd-b667-42fa-b040-4a186e4570e3 nodeName:}" failed. No retries permitted until 2025-12-09 14:58:54.134638369 +0000 UTC m=+44.271351097 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/35f7e7bd-b667-42fa-b040-4a186e4570e3-metrics-certs") pod "network-metrics-daemon-6sc2q" (UID: "35f7e7bd-b667-42fa-b040-4a186e4570e3") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.169486 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sc2q" Dec 09 14:58:50 crc kubenswrapper[4555]: E1209 14:58:50.169628 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6sc2q" podUID="35f7e7bd-b667-42fa-b040-4a186e4570e3" Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.185650 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"031bef68-de1b-47ae-92d9-8f3ab57baecb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd53bcc6affff59de2add405be1496710b4488d3794d358533ddaeecb1c50c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cd3af11c101edd63da0124a2190c0ee60408cf12e241de0b5743aff5732738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://787954aa7f87026339e7b167ba10c60c81f91c4d0f234fc508684f929edda0ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"
name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c591bf964c364ed5dfb0b30ae43f7fd456c549f701b0f1b8f7ab9ca7b8648f1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7803e19de1c3c096dfb29cd40b66a640cf8fe695b24eaf0ecfef38e88feefc27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:50Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.195979 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.196011 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:50 crc kubenswrapper[4555]: 
I1209 14:58:50.196019 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.196034 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.196043 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:50Z","lastTransitionTime":"2025-12-09T14:58:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.199594 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23358419-437c-41bf-bf1f-0da572d7ae77\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5ec008c2fa2ff6b0bffe7a32708ff1d0bfcfc0b6edbdd084977b99a2051bdc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://376e853c04e3ccafef162872380c7bf1dc0301c04c70c5428b965ca18989088f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-
o://6099f0072464b66ae672e70c295fe21a13657b81cc71dcecc55314cd91816c75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b8f1c6dd89c91a8ff2ca8baf361ee78811eeb8e10f2c4bb0504644cdd1302d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:50Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:50 crc kubenswrapper[4555]: E1209 14:58:50.204619 4555 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:58:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:58:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:50Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:58:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:58:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeByt
es\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"73387e21-fe11-4735-b3c1-0a3cfb47f8fd\\\",\\\"systemUUID\\\":\\\"b
d9f288e-1f46-4e96-a21f-7eb84ab89d9c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:50Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.206710 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.206740 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.206749 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.206761 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.206770 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:50Z","lastTransitionTime":"2025-12-09T14:58:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.207944 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:50Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.215159 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6sc2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35f7e7bd-b667-42fa-b040-4a186e4570e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mc8nv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mc8nv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:46Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6sc2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:50Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:50 crc kubenswrapper[4555]: E1209 14:58:50.217257 4555 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:58:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:58:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:58:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:58:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056
b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951
},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"73387e21-fe11-4735-b3c1-0a3cfb47f8fd\\\",\\\"systemUUID\\\":\\\"bd9f288e-1f46-4e96-a21f-7eb84ab89d9c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2025-12-09T14:58:50Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.219465 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.219492 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.219502 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.219514 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.219523 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:50Z","lastTransitionTime":"2025-12-09T14:58:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.222427 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-72msc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4f13f82-a44f-461d-93b0-916a7ed0b925\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06c787dc46c305e7388f67f85601eb3e724c501d4da0504d4f8fd569f533c634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2spqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[
{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-72msc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:50Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:50 crc kubenswrapper[4555]: E1209 14:58:50.226794 4555 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:58:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:58:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:58:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:58:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"73387e21-fe11-4735-b3c1-0a3cfb47f8fd\\\",\\\"systemUUID\\\":\\\"bd9f288e-1f46-4e96-a21f-7eb84ab89d9c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:50Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.228745 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.228774 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.228783 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.228797 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.228806 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:50Z","lastTransitionTime":"2025-12-09T14:58:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.236586 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cb89a32-a132-4a5c-88ca-e561d2462abb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8452332d27457e371020c42533ef78285e33851568eed69be71bc9d10e663b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32422f3e8abddcc0d533cce0b4fc55c65309e00a02d4757d415d6f63f9b2e47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resourc
es\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9db356805ba4080379f39ddf432276ac77417504341e77c1cc2d704756af1f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550d7f8a2e6cf94de70026192f2660d59f31fa1675a7c7e432767db5bf58c1b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://630ad7ccda8e51ae4d116b5821e71ebcc3fb68b4614de58470610807ad351ed6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:50Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:50 crc kubenswrapper[4555]: E1209 14:58:50.239068 4555 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:58:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:58:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:58:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:58:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\
"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":45063
7738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"73387e21-fe11-4735-b3c1-0a3cfb47f8fd\\\",\\\"systemUUID\\\":\\\"bd9f288e-1f46-4e96-a21f-7eb84ab89d9c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:50Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.241196 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.241221 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.241232 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.241255 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.241265 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:50Z","lastTransitionTime":"2025-12-09T14:58:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.245049 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d81323acabedcf739459ccc35af7dcbf68cd1159c9304c0ef841d774287425cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:50Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:50 crc kubenswrapper[4555]: E1209 14:58:50.249237 4555 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:58:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:58:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:50Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:58:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:58:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeByt
es\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"73387e21-fe11-4735-b3c1-0a3cfb47f8fd\\\",\\\"systemUUID\\\":\\\"b
d9f288e-1f46-4e96-a21f-7eb84ab89d9c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:50Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:50 crc kubenswrapper[4555]: E1209 14:58:50.249387 4555 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.250462 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.250496 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.250507 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.250521 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.250532 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:50Z","lastTransitionTime":"2025-12-09T14:58:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.254442 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:50Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.261901 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://222e3c51a925bbfd932c3620f13ad919006e2dad2c9063c60559ece4dd239f3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:50Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.269944 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gjjcm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e34542c-29e8-43d3-add3-42aa080e0c8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b138ef0764baeeae025ff996cd643edabbc1de8f567e73f41fa748f481929656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqc9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gjjcm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:50Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.281802 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c87d128c-39da-4880-9efc-7d2e6319a854\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dee71204e7a71c311143a6ed6fbde69968bc877053d6a400d252eaa010a88ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6cd4a3aeb4f2f1ad5ad6b22f80134a393ccb4c5b7e22c33d97324c8f70800d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a05bba24ff7f1d956fa0d3cb5949de61a278cddb4674bdb723749a2b005d82d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec3d3a350dce6c9b5ea637d00abdaf43796a5595105dc82133f3d84349c12c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://932166905ddf0fc44b9b7356a8f5efe68e1d44307d8e486946c8285ae969b8e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64685
d2778ef0cf9d316897042a9e9353603a0cc7a234baf821ec03b149aeb07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5901d1e7ebd3d979587ab0b065c05f639de1f07b2139edac2d4cf80c289503ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5901d1e7ebd3d979587ab0b065c05f639de1f07b2139edac2d4cf80c289503ff\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"message\\\":\\\"1.918808 5946 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-p4bl4 after 0 failed attempt(s)\\\\nI1209 14:58:41.919392 5946 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-p4bl4\\\\nI1209 14:58:41.918113 5946 services_controller.go:434] Service openshift-cluster-version/cluster-version-operator retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{cluster-version-operator openshift-cluster-version ddf4933a-f532-4906-9b8f-3b15aa433264 6187 0 2025-02-23 05:11:57 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:cluster-version-operator] map[exclude.release.openshift.io/internal-openshift-hosted:true include.release.openshift.io/self-managed-high-availability:true kubernetes.io/description:Expose cluster-version operator metrics to other in-cluster consumers. Access requires a prometheus-k8s RoleBinding in this namespace. 
service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:cluster-version-operator-serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc0006c6a07 \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Nam\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-p4bl4_openshift-ovn-kubernetes(c87d128c-39da-4880-9efc-7d2e6319a854)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9dc3533604ee31d652a75a8c65146c35b320c41f2d4fe85c3c03cadb99a9d46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"o
vnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ce6bcdccee7782ed6a2472052064ea5f0f9a5f25a8645c692165a2a4bf16dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9ce6bcdccee7782ed6a2472052064ea5f0f9a5f25a8645c692165a2a4bf16dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p4bl4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:50Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.289926 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:50Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.297821 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd5a66337668da0b71f8033ec52d3d1ed9ac69f468d6e48c1e863204ec6a6349\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2fd50693723f3d30a7f5fb890ed870f55b357017b85cdf7206ffc97d0d3d9b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:50Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.305669 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b929ab12-7d60-4e9f-a1a7-3e2d8b086d36\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04e20c1851a9313f071ddc6ae1191e467f3bea284ea6e0defd914dc5c926b7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff6nf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b220eef2e4180fd65d4a5381999b91753bb7322a9ab7101edf783cb800715fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea17
7225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff6nf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x6p6q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:50Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.315363 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tpsq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd8f66c1-da34-466f-ba53-4848c96b1287\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ae3cc97b46fa36f9bd6f439124d81700af58cb44671d46ffa2936260f6dda4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2028082f4c35a431ec183abac3ace39ae9d9d292a8b3b879cf35cf03dc606d21\\\",\\\"image\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2028082f4c35a431ec183abac3ace39ae9d9d292a8b3b879cf35cf03dc606d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f6498db60b815cc152db11a9d809f415e6ed632f66cf7eaeecbacb230d59dbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f6498db60b815cc152db11a9d809f415e6ed632f66cf7eaeecbacb230d59dbe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd308a4726360ffaa40fd9895d4684062d998a9dec9c76222a4be5140ece0ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd308a4726360ffaa40fd9895d4684062d998a9dec9c76222a4be5140ece0ced\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin
\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e047250a4bced8e37ee26d9ba3065a61294fa9cc6d9491b6b96e35758b431edc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e047250a4bced8e37ee26d9ba3065a61294fa9cc6d9491b6b96e35758b431edc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d24862fb9296afeb3ec86c0cd5e862021c7e4c8a0f139f9fd11dfd1faba2b392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d24862fb9296afeb3ec86c0cd5e862021c7e4c8a0f139f9fd11dfd1faba2b392\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ac07e538318096b1c9c2c8d0608f0b4d9b4367df0bb24575591b2f3f6c28c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ac07e5383180
96b1c9c2c8d0608f0b4d9b4367df0bb24575591b2f3f6c28c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tpsq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:50Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.322573 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qchk6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae951ddc-0e90-497c-af85-8e39088f32d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59535f0584581ae9d42fa3c1d9f4f204c8a2c7c96d211a2b8d74806eb3901a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwhfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1a5e10f192fc24405b980ce611a09f1e0c6d49c4c94df89bceef151a747f1aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/opens
hift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwhfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qchk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:50Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.329158 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t6kl9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c97d4dc5-ce47-4a7b-9511-7168db0ee4b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd97a458fe790bf9d7a8587a77d464c8bc4f97919bf4646660d92bd016be2707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47xhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\
\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t6kl9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:50Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.353020 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.353052 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.353064 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.353080 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.353090 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:50Z","lastTransitionTime":"2025-12-09T14:58:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.454389 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.454597 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.454690 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.454777 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.454842 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:50Z","lastTransitionTime":"2025-12-09T14:58:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.556197 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.556223 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.556233 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.556257 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.556265 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:50Z","lastTransitionTime":"2025-12-09T14:58:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.658329 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.658357 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.658367 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.658378 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.658386 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:50Z","lastTransitionTime":"2025-12-09T14:58:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.759765 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.759793 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.759803 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.759814 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.759822 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:50Z","lastTransitionTime":"2025-12-09T14:58:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.861556 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.861585 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.861593 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.861604 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.861612 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:50Z","lastTransitionTime":"2025-12-09T14:58:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.963496 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.963525 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.963534 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.963544 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:50 crc kubenswrapper[4555]: I1209 14:58:50.963551 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:50Z","lastTransitionTime":"2025-12-09T14:58:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:51 crc kubenswrapper[4555]: I1209 14:58:51.065532 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:51 crc kubenswrapper[4555]: I1209 14:58:51.065577 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:51 crc kubenswrapper[4555]: I1209 14:58:51.065587 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:51 crc kubenswrapper[4555]: I1209 14:58:51.065598 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:51 crc kubenswrapper[4555]: I1209 14:58:51.065605 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:51Z","lastTransitionTime":"2025-12-09T14:58:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:51 crc kubenswrapper[4555]: I1209 14:58:51.160357 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:58:51 crc kubenswrapper[4555]: E1209 14:58:51.160455 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 14:58:51 crc kubenswrapper[4555]: I1209 14:58:51.160518 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:58:51 crc kubenswrapper[4555]: E1209 14:58:51.160616 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 14:58:51 crc kubenswrapper[4555]: I1209 14:58:51.160630 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:58:51 crc kubenswrapper[4555]: E1209 14:58:51.160847 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 14:58:51 crc kubenswrapper[4555]: I1209 14:58:51.167439 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:51 crc kubenswrapper[4555]: I1209 14:58:51.167491 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:51 crc kubenswrapper[4555]: I1209 14:58:51.167502 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:51 crc kubenswrapper[4555]: I1209 14:58:51.167514 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:51 crc kubenswrapper[4555]: I1209 14:58:51.167521 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:51Z","lastTransitionTime":"2025-12-09T14:58:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:51 crc kubenswrapper[4555]: I1209 14:58:51.270030 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:51 crc kubenswrapper[4555]: I1209 14:58:51.270084 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:51 crc kubenswrapper[4555]: I1209 14:58:51.270092 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:51 crc kubenswrapper[4555]: I1209 14:58:51.270106 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:51 crc kubenswrapper[4555]: I1209 14:58:51.270115 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:51Z","lastTransitionTime":"2025-12-09T14:58:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:51 crc kubenswrapper[4555]: I1209 14:58:51.372976 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:51 crc kubenswrapper[4555]: I1209 14:58:51.373025 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:51 crc kubenswrapper[4555]: I1209 14:58:51.373034 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:51 crc kubenswrapper[4555]: I1209 14:58:51.373048 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:51 crc kubenswrapper[4555]: I1209 14:58:51.373056 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:51Z","lastTransitionTime":"2025-12-09T14:58:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:51 crc kubenswrapper[4555]: I1209 14:58:51.475604 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:51 crc kubenswrapper[4555]: I1209 14:58:51.475638 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:51 crc kubenswrapper[4555]: I1209 14:58:51.475647 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:51 crc kubenswrapper[4555]: I1209 14:58:51.475660 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:51 crc kubenswrapper[4555]: I1209 14:58:51.475670 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:51Z","lastTransitionTime":"2025-12-09T14:58:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:51 crc kubenswrapper[4555]: I1209 14:58:51.577583 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:51 crc kubenswrapper[4555]: I1209 14:58:51.577616 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:51 crc kubenswrapper[4555]: I1209 14:58:51.577625 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:51 crc kubenswrapper[4555]: I1209 14:58:51.577638 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:51 crc kubenswrapper[4555]: I1209 14:58:51.577647 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:51Z","lastTransitionTime":"2025-12-09T14:58:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:51 crc kubenswrapper[4555]: I1209 14:58:51.680032 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:51 crc kubenswrapper[4555]: I1209 14:58:51.680067 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:51 crc kubenswrapper[4555]: I1209 14:58:51.680076 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:51 crc kubenswrapper[4555]: I1209 14:58:51.680089 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:51 crc kubenswrapper[4555]: I1209 14:58:51.680099 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:51Z","lastTransitionTime":"2025-12-09T14:58:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:51 crc kubenswrapper[4555]: I1209 14:58:51.781864 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:51 crc kubenswrapper[4555]: I1209 14:58:51.781892 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:51 crc kubenswrapper[4555]: I1209 14:58:51.781901 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:51 crc kubenswrapper[4555]: I1209 14:58:51.781911 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:51 crc kubenswrapper[4555]: I1209 14:58:51.781921 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:51Z","lastTransitionTime":"2025-12-09T14:58:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:51 crc kubenswrapper[4555]: I1209 14:58:51.883440 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:51 crc kubenswrapper[4555]: I1209 14:58:51.883464 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:51 crc kubenswrapper[4555]: I1209 14:58:51.883472 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:51 crc kubenswrapper[4555]: I1209 14:58:51.883483 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:51 crc kubenswrapper[4555]: I1209 14:58:51.883489 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:51Z","lastTransitionTime":"2025-12-09T14:58:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:51 crc kubenswrapper[4555]: I1209 14:58:51.985235 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:51 crc kubenswrapper[4555]: I1209 14:58:51.985266 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:51 crc kubenswrapper[4555]: I1209 14:58:51.985275 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:51 crc kubenswrapper[4555]: I1209 14:58:51.985286 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:51 crc kubenswrapper[4555]: I1209 14:58:51.985294 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:51Z","lastTransitionTime":"2025-12-09T14:58:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:52 crc kubenswrapper[4555]: I1209 14:58:52.086868 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:52 crc kubenswrapper[4555]: I1209 14:58:52.086912 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:52 crc kubenswrapper[4555]: I1209 14:58:52.086922 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:52 crc kubenswrapper[4555]: I1209 14:58:52.086936 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:52 crc kubenswrapper[4555]: I1209 14:58:52.086948 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:52Z","lastTransitionTime":"2025-12-09T14:58:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:52 crc kubenswrapper[4555]: I1209 14:58:52.160366 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sc2q" Dec 09 14:58:52 crc kubenswrapper[4555]: E1209 14:58:52.160483 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6sc2q" podUID="35f7e7bd-b667-42fa-b040-4a186e4570e3" Dec 09 14:58:52 crc kubenswrapper[4555]: I1209 14:58:52.188427 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:52 crc kubenswrapper[4555]: I1209 14:58:52.188453 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:52 crc kubenswrapper[4555]: I1209 14:58:52.188462 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:52 crc kubenswrapper[4555]: I1209 14:58:52.188472 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:52 crc kubenswrapper[4555]: I1209 14:58:52.188482 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:52Z","lastTransitionTime":"2025-12-09T14:58:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:52 crc kubenswrapper[4555]: I1209 14:58:52.290167 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:52 crc kubenswrapper[4555]: I1209 14:58:52.290228 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:52 crc kubenswrapper[4555]: I1209 14:58:52.290238 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:52 crc kubenswrapper[4555]: I1209 14:58:52.290251 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:52 crc kubenswrapper[4555]: I1209 14:58:52.290260 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:52Z","lastTransitionTime":"2025-12-09T14:58:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:52 crc kubenswrapper[4555]: I1209 14:58:52.392394 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:52 crc kubenswrapper[4555]: I1209 14:58:52.392438 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:52 crc kubenswrapper[4555]: I1209 14:58:52.392449 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:52 crc kubenswrapper[4555]: I1209 14:58:52.392463 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:52 crc kubenswrapper[4555]: I1209 14:58:52.392473 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:52Z","lastTransitionTime":"2025-12-09T14:58:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:52 crc kubenswrapper[4555]: I1209 14:58:52.494422 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:52 crc kubenswrapper[4555]: I1209 14:58:52.494455 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:52 crc kubenswrapper[4555]: I1209 14:58:52.494464 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:52 crc kubenswrapper[4555]: I1209 14:58:52.494476 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:52 crc kubenswrapper[4555]: I1209 14:58:52.494485 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:52Z","lastTransitionTime":"2025-12-09T14:58:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:52 crc kubenswrapper[4555]: I1209 14:58:52.596431 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:52 crc kubenswrapper[4555]: I1209 14:58:52.596483 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:52 crc kubenswrapper[4555]: I1209 14:58:52.596495 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:52 crc kubenswrapper[4555]: I1209 14:58:52.596507 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:52 crc kubenswrapper[4555]: I1209 14:58:52.596516 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:52Z","lastTransitionTime":"2025-12-09T14:58:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:52 crc kubenswrapper[4555]: I1209 14:58:52.698468 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:52 crc kubenswrapper[4555]: I1209 14:58:52.698510 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:52 crc kubenswrapper[4555]: I1209 14:58:52.698521 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:52 crc kubenswrapper[4555]: I1209 14:58:52.698540 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:52 crc kubenswrapper[4555]: I1209 14:58:52.698554 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:52Z","lastTransitionTime":"2025-12-09T14:58:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:52 crc kubenswrapper[4555]: I1209 14:58:52.799985 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:52 crc kubenswrapper[4555]: I1209 14:58:52.800119 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:52 crc kubenswrapper[4555]: I1209 14:58:52.800190 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:52 crc kubenswrapper[4555]: I1209 14:58:52.800265 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:52 crc kubenswrapper[4555]: I1209 14:58:52.800367 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:52Z","lastTransitionTime":"2025-12-09T14:58:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:52 crc kubenswrapper[4555]: I1209 14:58:52.902546 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:52 crc kubenswrapper[4555]: I1209 14:58:52.902569 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:52 crc kubenswrapper[4555]: I1209 14:58:52.902578 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:52 crc kubenswrapper[4555]: I1209 14:58:52.902589 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:52 crc kubenswrapper[4555]: I1209 14:58:52.902599 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:52Z","lastTransitionTime":"2025-12-09T14:58:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:53 crc kubenswrapper[4555]: I1209 14:58:53.004074 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:53 crc kubenswrapper[4555]: I1209 14:58:53.004165 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:53 crc kubenswrapper[4555]: I1209 14:58:53.004241 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:53 crc kubenswrapper[4555]: I1209 14:58:53.004318 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:53 crc kubenswrapper[4555]: I1209 14:58:53.004386 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:53Z","lastTransitionTime":"2025-12-09T14:58:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:53 crc kubenswrapper[4555]: I1209 14:58:53.106436 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:53 crc kubenswrapper[4555]: I1209 14:58:53.106467 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:53 crc kubenswrapper[4555]: I1209 14:58:53.106498 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:53 crc kubenswrapper[4555]: I1209 14:58:53.106511 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:53 crc kubenswrapper[4555]: I1209 14:58:53.106522 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:53Z","lastTransitionTime":"2025-12-09T14:58:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:53 crc kubenswrapper[4555]: I1209 14:58:53.160344 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:58:53 crc kubenswrapper[4555]: I1209 14:58:53.160409 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:58:53 crc kubenswrapper[4555]: I1209 14:58:53.160371 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:58:53 crc kubenswrapper[4555]: E1209 14:58:53.160494 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 14:58:53 crc kubenswrapper[4555]: E1209 14:58:53.160574 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 14:58:53 crc kubenswrapper[4555]: E1209 14:58:53.160653 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 14:58:53 crc kubenswrapper[4555]: I1209 14:58:53.208608 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:53 crc kubenswrapper[4555]: I1209 14:58:53.208652 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:53 crc kubenswrapper[4555]: I1209 14:58:53.208664 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:53 crc kubenswrapper[4555]: I1209 14:58:53.208685 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:53 crc kubenswrapper[4555]: I1209 14:58:53.208700 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:53Z","lastTransitionTime":"2025-12-09T14:58:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:53 crc kubenswrapper[4555]: I1209 14:58:53.310595 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:53 crc kubenswrapper[4555]: I1209 14:58:53.310628 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:53 crc kubenswrapper[4555]: I1209 14:58:53.310654 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:53 crc kubenswrapper[4555]: I1209 14:58:53.310667 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:53 crc kubenswrapper[4555]: I1209 14:58:53.310675 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:53Z","lastTransitionTime":"2025-12-09T14:58:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:53 crc kubenswrapper[4555]: I1209 14:58:53.413100 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:53 crc kubenswrapper[4555]: I1209 14:58:53.413137 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:53 crc kubenswrapper[4555]: I1209 14:58:53.413145 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:53 crc kubenswrapper[4555]: I1209 14:58:53.413158 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:53 crc kubenswrapper[4555]: I1209 14:58:53.413165 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:53Z","lastTransitionTime":"2025-12-09T14:58:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:53 crc kubenswrapper[4555]: I1209 14:58:53.515616 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:53 crc kubenswrapper[4555]: I1209 14:58:53.515662 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:53 crc kubenswrapper[4555]: I1209 14:58:53.515673 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:53 crc kubenswrapper[4555]: I1209 14:58:53.515690 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:53 crc kubenswrapper[4555]: I1209 14:58:53.515702 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:53Z","lastTransitionTime":"2025-12-09T14:58:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:53 crc kubenswrapper[4555]: I1209 14:58:53.618009 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:53 crc kubenswrapper[4555]: I1209 14:58:53.618048 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:53 crc kubenswrapper[4555]: I1209 14:58:53.618061 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:53 crc kubenswrapper[4555]: I1209 14:58:53.618076 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:53 crc kubenswrapper[4555]: I1209 14:58:53.618089 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:53Z","lastTransitionTime":"2025-12-09T14:58:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:53 crc kubenswrapper[4555]: I1209 14:58:53.719924 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:53 crc kubenswrapper[4555]: I1209 14:58:53.719952 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:53 crc kubenswrapper[4555]: I1209 14:58:53.719962 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:53 crc kubenswrapper[4555]: I1209 14:58:53.719974 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:53 crc kubenswrapper[4555]: I1209 14:58:53.719985 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:53Z","lastTransitionTime":"2025-12-09T14:58:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:53 crc kubenswrapper[4555]: I1209 14:58:53.821906 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:53 crc kubenswrapper[4555]: I1209 14:58:53.821944 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:53 crc kubenswrapper[4555]: I1209 14:58:53.821957 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:53 crc kubenswrapper[4555]: I1209 14:58:53.821971 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:53 crc kubenswrapper[4555]: I1209 14:58:53.821984 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:53Z","lastTransitionTime":"2025-12-09T14:58:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:53 crc kubenswrapper[4555]: I1209 14:58:53.923155 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:53 crc kubenswrapper[4555]: I1209 14:58:53.923202 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:53 crc kubenswrapper[4555]: I1209 14:58:53.923215 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:53 crc kubenswrapper[4555]: I1209 14:58:53.923229 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:53 crc kubenswrapper[4555]: I1209 14:58:53.923237 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:53Z","lastTransitionTime":"2025-12-09T14:58:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:54 crc kubenswrapper[4555]: I1209 14:58:54.025353 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:54 crc kubenswrapper[4555]: I1209 14:58:54.025491 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:54 crc kubenswrapper[4555]: I1209 14:58:54.025555 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:54 crc kubenswrapper[4555]: I1209 14:58:54.025612 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:54 crc kubenswrapper[4555]: I1209 14:58:54.025661 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:54Z","lastTransitionTime":"2025-12-09T14:58:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:54 crc kubenswrapper[4555]: I1209 14:58:54.127907 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:54 crc kubenswrapper[4555]: I1209 14:58:54.127950 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:54 crc kubenswrapper[4555]: I1209 14:58:54.127960 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:54 crc kubenswrapper[4555]: I1209 14:58:54.127974 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:54 crc kubenswrapper[4555]: I1209 14:58:54.127987 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:54Z","lastTransitionTime":"2025-12-09T14:58:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:54 crc kubenswrapper[4555]: I1209 14:58:54.160920 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sc2q" Dec 09 14:58:54 crc kubenswrapper[4555]: E1209 14:58:54.161050 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6sc2q" podUID="35f7e7bd-b667-42fa-b040-4a186e4570e3" Dec 09 14:58:54 crc kubenswrapper[4555]: I1209 14:58:54.169219 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/35f7e7bd-b667-42fa-b040-4a186e4570e3-metrics-certs\") pod \"network-metrics-daemon-6sc2q\" (UID: \"35f7e7bd-b667-42fa-b040-4a186e4570e3\") " pod="openshift-multus/network-metrics-daemon-6sc2q" Dec 09 14:58:54 crc kubenswrapper[4555]: E1209 14:58:54.169363 4555 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 14:58:54 crc kubenswrapper[4555]: E1209 14:58:54.169448 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/35f7e7bd-b667-42fa-b040-4a186e4570e3-metrics-certs podName:35f7e7bd-b667-42fa-b040-4a186e4570e3 nodeName:}" failed. No retries permitted until 2025-12-09 14:59:02.169425011 +0000 UTC m=+52.306137739 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/35f7e7bd-b667-42fa-b040-4a186e4570e3-metrics-certs") pod "network-metrics-daemon-6sc2q" (UID: "35f7e7bd-b667-42fa-b040-4a186e4570e3") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 14:58:54 crc kubenswrapper[4555]: I1209 14:58:54.229376 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:54 crc kubenswrapper[4555]: I1209 14:58:54.229403 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:54 crc kubenswrapper[4555]: I1209 14:58:54.229413 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:54 crc kubenswrapper[4555]: I1209 14:58:54.229423 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:54 crc kubenswrapper[4555]: I1209 14:58:54.229432 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:54Z","lastTransitionTime":"2025-12-09T14:58:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:54 crc kubenswrapper[4555]: I1209 14:58:54.330497 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:54 crc kubenswrapper[4555]: I1209 14:58:54.330541 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:54 crc kubenswrapper[4555]: I1209 14:58:54.330551 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:54 crc kubenswrapper[4555]: I1209 14:58:54.330561 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:54 crc kubenswrapper[4555]: I1209 14:58:54.330569 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:54Z","lastTransitionTime":"2025-12-09T14:58:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:54 crc kubenswrapper[4555]: I1209 14:58:54.432175 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:54 crc kubenswrapper[4555]: I1209 14:58:54.432212 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:54 crc kubenswrapper[4555]: I1209 14:58:54.432221 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:54 crc kubenswrapper[4555]: I1209 14:58:54.432230 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:54 crc kubenswrapper[4555]: I1209 14:58:54.432240 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:54Z","lastTransitionTime":"2025-12-09T14:58:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:54 crc kubenswrapper[4555]: I1209 14:58:54.534109 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:54 crc kubenswrapper[4555]: I1209 14:58:54.534135 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:54 crc kubenswrapper[4555]: I1209 14:58:54.534144 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:54 crc kubenswrapper[4555]: I1209 14:58:54.534156 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:54 crc kubenswrapper[4555]: I1209 14:58:54.534164 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:54Z","lastTransitionTime":"2025-12-09T14:58:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:54 crc kubenswrapper[4555]: I1209 14:58:54.635575 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:54 crc kubenswrapper[4555]: I1209 14:58:54.635603 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:54 crc kubenswrapper[4555]: I1209 14:58:54.635611 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:54 crc kubenswrapper[4555]: I1209 14:58:54.635621 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:54 crc kubenswrapper[4555]: I1209 14:58:54.635630 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:54Z","lastTransitionTime":"2025-12-09T14:58:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:54 crc kubenswrapper[4555]: I1209 14:58:54.737597 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:54 crc kubenswrapper[4555]: I1209 14:58:54.737709 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:54 crc kubenswrapper[4555]: I1209 14:58:54.737776 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:54 crc kubenswrapper[4555]: I1209 14:58:54.737842 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:54 crc kubenswrapper[4555]: I1209 14:58:54.737896 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:54Z","lastTransitionTime":"2025-12-09T14:58:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:54 crc kubenswrapper[4555]: I1209 14:58:54.839523 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:54 crc kubenswrapper[4555]: I1209 14:58:54.839564 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:54 crc kubenswrapper[4555]: I1209 14:58:54.839575 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:54 crc kubenswrapper[4555]: I1209 14:58:54.839591 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:54 crc kubenswrapper[4555]: I1209 14:58:54.839601 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:54Z","lastTransitionTime":"2025-12-09T14:58:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:54 crc kubenswrapper[4555]: I1209 14:58:54.941505 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:54 crc kubenswrapper[4555]: I1209 14:58:54.941543 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:54 crc kubenswrapper[4555]: I1209 14:58:54.941551 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:54 crc kubenswrapper[4555]: I1209 14:58:54.941565 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:54 crc kubenswrapper[4555]: I1209 14:58:54.941574 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:54Z","lastTransitionTime":"2025-12-09T14:58:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:55 crc kubenswrapper[4555]: I1209 14:58:55.043235 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:55 crc kubenswrapper[4555]: I1209 14:58:55.043268 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:55 crc kubenswrapper[4555]: I1209 14:58:55.043278 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:55 crc kubenswrapper[4555]: I1209 14:58:55.043291 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:55 crc kubenswrapper[4555]: I1209 14:58:55.043313 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:55Z","lastTransitionTime":"2025-12-09T14:58:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:55 crc kubenswrapper[4555]: I1209 14:58:55.145135 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:55 crc kubenswrapper[4555]: I1209 14:58:55.145158 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:55 crc kubenswrapper[4555]: I1209 14:58:55.145166 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:55 crc kubenswrapper[4555]: I1209 14:58:55.145184 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:55 crc kubenswrapper[4555]: I1209 14:58:55.145191 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:55Z","lastTransitionTime":"2025-12-09T14:58:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:55 crc kubenswrapper[4555]: I1209 14:58:55.160678 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:58:55 crc kubenswrapper[4555]: I1209 14:58:55.160724 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:58:55 crc kubenswrapper[4555]: I1209 14:58:55.160678 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:58:55 crc kubenswrapper[4555]: E1209 14:58:55.160770 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 14:58:55 crc kubenswrapper[4555]: E1209 14:58:55.160850 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 14:58:55 crc kubenswrapper[4555]: E1209 14:58:55.160931 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 14:58:55 crc kubenswrapper[4555]: I1209 14:58:55.246930 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:55 crc kubenswrapper[4555]: I1209 14:58:55.246968 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:55 crc kubenswrapper[4555]: I1209 14:58:55.246978 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:55 crc kubenswrapper[4555]: I1209 14:58:55.246994 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:55 crc kubenswrapper[4555]: I1209 14:58:55.247005 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:55Z","lastTransitionTime":"2025-12-09T14:58:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:55 crc kubenswrapper[4555]: I1209 14:58:55.348288 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:55 crc kubenswrapper[4555]: I1209 14:58:55.348336 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:55 crc kubenswrapper[4555]: I1209 14:58:55.348345 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:55 crc kubenswrapper[4555]: I1209 14:58:55.348356 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:55 crc kubenswrapper[4555]: I1209 14:58:55.348364 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:55Z","lastTransitionTime":"2025-12-09T14:58:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:55 crc kubenswrapper[4555]: I1209 14:58:55.449801 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:55 crc kubenswrapper[4555]: I1209 14:58:55.449835 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:55 crc kubenswrapper[4555]: I1209 14:58:55.449843 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:55 crc kubenswrapper[4555]: I1209 14:58:55.449855 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:55 crc kubenswrapper[4555]: I1209 14:58:55.449864 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:55Z","lastTransitionTime":"2025-12-09T14:58:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:55 crc kubenswrapper[4555]: I1209 14:58:55.552140 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:55 crc kubenswrapper[4555]: I1209 14:58:55.552193 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:55 crc kubenswrapper[4555]: I1209 14:58:55.552202 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:55 crc kubenswrapper[4555]: I1209 14:58:55.552215 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:55 crc kubenswrapper[4555]: I1209 14:58:55.552223 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:55Z","lastTransitionTime":"2025-12-09T14:58:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:55 crc kubenswrapper[4555]: I1209 14:58:55.654285 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:55 crc kubenswrapper[4555]: I1209 14:58:55.654349 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:55 crc kubenswrapper[4555]: I1209 14:58:55.654361 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:55 crc kubenswrapper[4555]: I1209 14:58:55.654373 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:55 crc kubenswrapper[4555]: I1209 14:58:55.654382 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:55Z","lastTransitionTime":"2025-12-09T14:58:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:55 crc kubenswrapper[4555]: I1209 14:58:55.756582 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:55 crc kubenswrapper[4555]: I1209 14:58:55.756611 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:55 crc kubenswrapper[4555]: I1209 14:58:55.756620 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:55 crc kubenswrapper[4555]: I1209 14:58:55.756633 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:55 crc kubenswrapper[4555]: I1209 14:58:55.756641 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:55Z","lastTransitionTime":"2025-12-09T14:58:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:55 crc kubenswrapper[4555]: I1209 14:58:55.857985 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:55 crc kubenswrapper[4555]: I1209 14:58:55.858017 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:55 crc kubenswrapper[4555]: I1209 14:58:55.858026 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:55 crc kubenswrapper[4555]: I1209 14:58:55.858037 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:55 crc kubenswrapper[4555]: I1209 14:58:55.858044 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:55Z","lastTransitionTime":"2025-12-09T14:58:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:55 crc kubenswrapper[4555]: I1209 14:58:55.959817 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:55 crc kubenswrapper[4555]: I1209 14:58:55.959845 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:55 crc kubenswrapper[4555]: I1209 14:58:55.959852 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:55 crc kubenswrapper[4555]: I1209 14:58:55.959864 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:55 crc kubenswrapper[4555]: I1209 14:58:55.959889 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:55Z","lastTransitionTime":"2025-12-09T14:58:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:56 crc kubenswrapper[4555]: I1209 14:58:56.061869 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:56 crc kubenswrapper[4555]: I1209 14:58:56.061903 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:56 crc kubenswrapper[4555]: I1209 14:58:56.061912 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:56 crc kubenswrapper[4555]: I1209 14:58:56.061924 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:56 crc kubenswrapper[4555]: I1209 14:58:56.061934 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:56Z","lastTransitionTime":"2025-12-09T14:58:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:56 crc kubenswrapper[4555]: I1209 14:58:56.161065 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sc2q" Dec 09 14:58:56 crc kubenswrapper[4555]: E1209 14:58:56.161187 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6sc2q" podUID="35f7e7bd-b667-42fa-b040-4a186e4570e3" Dec 09 14:58:56 crc kubenswrapper[4555]: I1209 14:58:56.163442 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:56 crc kubenswrapper[4555]: I1209 14:58:56.163479 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:56 crc kubenswrapper[4555]: I1209 14:58:56.163489 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:56 crc kubenswrapper[4555]: I1209 14:58:56.163501 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:56 crc kubenswrapper[4555]: I1209 14:58:56.163510 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:56Z","lastTransitionTime":"2025-12-09T14:58:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:56 crc kubenswrapper[4555]: I1209 14:58:56.264934 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:56 crc kubenswrapper[4555]: I1209 14:58:56.264962 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:56 crc kubenswrapper[4555]: I1209 14:58:56.264970 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:56 crc kubenswrapper[4555]: I1209 14:58:56.264982 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:56 crc kubenswrapper[4555]: I1209 14:58:56.264990 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:56Z","lastTransitionTime":"2025-12-09T14:58:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:56 crc kubenswrapper[4555]: I1209 14:58:56.366505 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:56 crc kubenswrapper[4555]: I1209 14:58:56.366538 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:56 crc kubenswrapper[4555]: I1209 14:58:56.366547 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:56 crc kubenswrapper[4555]: I1209 14:58:56.366559 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:56 crc kubenswrapper[4555]: I1209 14:58:56.366567 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:56Z","lastTransitionTime":"2025-12-09T14:58:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:56 crc kubenswrapper[4555]: I1209 14:58:56.467911 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:56 crc kubenswrapper[4555]: I1209 14:58:56.467939 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:56 crc kubenswrapper[4555]: I1209 14:58:56.467947 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:56 crc kubenswrapper[4555]: I1209 14:58:56.467957 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:56 crc kubenswrapper[4555]: I1209 14:58:56.467965 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:56Z","lastTransitionTime":"2025-12-09T14:58:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:56 crc kubenswrapper[4555]: I1209 14:58:56.569893 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:56 crc kubenswrapper[4555]: I1209 14:58:56.569916 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:56 crc kubenswrapper[4555]: I1209 14:58:56.569924 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:56 crc kubenswrapper[4555]: I1209 14:58:56.569934 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:56 crc kubenswrapper[4555]: I1209 14:58:56.569941 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:56Z","lastTransitionTime":"2025-12-09T14:58:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 09 14:58:56 crc kubenswrapper[4555]: I1209 14:58:56.671987 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 09 14:58:56 crc kubenswrapper[4555]: I1209 14:58:56.672014 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 09 14:58:56 crc kubenswrapper[4555]: I1209 14:58:56.672021 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 09 14:58:56 crc kubenswrapper[4555]: I1209 14:58:56.672032 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 09 14:58:56 crc kubenswrapper[4555]: I1209 14:58:56.672039 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:56Z","lastTransitionTime":"2025-12-09T14:58:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 09 14:58:56 crc kubenswrapper[4555]: I1209 14:58:56.774123 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 09 14:58:56 crc kubenswrapper[4555]: I1209 14:58:56.774160 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 09 14:58:56 crc kubenswrapper[4555]: I1209 14:58:56.774170 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 09 14:58:56 crc kubenswrapper[4555]: I1209 14:58:56.774180 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 09 14:58:56 crc kubenswrapper[4555]: I1209 14:58:56.774192 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:56Z","lastTransitionTime":"2025-12-09T14:58:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 09 14:58:56 crc kubenswrapper[4555]: I1209 14:58:56.875815 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 09 14:58:56 crc kubenswrapper[4555]: I1209 14:58:56.875851 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 09 14:58:56 crc kubenswrapper[4555]: I1209 14:58:56.875859 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 09 14:58:56 crc kubenswrapper[4555]: I1209 14:58:56.875871 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 09 14:58:56 crc kubenswrapper[4555]: I1209 14:58:56.875880 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:56Z","lastTransitionTime":"2025-12-09T14:58:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 09 14:58:56 crc kubenswrapper[4555]: I1209 14:58:56.977415 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 09 14:58:56 crc kubenswrapper[4555]: I1209 14:58:56.977458 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 09 14:58:56 crc kubenswrapper[4555]: I1209 14:58:56.977466 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 09 14:58:56 crc kubenswrapper[4555]: I1209 14:58:56.977475 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 09 14:58:56 crc kubenswrapper[4555]: I1209 14:58:56.977483 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:56Z","lastTransitionTime":"2025-12-09T14:58:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 09 14:58:57 crc kubenswrapper[4555]: I1209 14:58:57.079791 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 09 14:58:57 crc kubenswrapper[4555]: I1209 14:58:57.079838 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 09 14:58:57 crc kubenswrapper[4555]: I1209 14:58:57.079850 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 09 14:58:57 crc kubenswrapper[4555]: I1209 14:58:57.079861 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 09 14:58:57 crc kubenswrapper[4555]: I1209 14:58:57.079869 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:57Z","lastTransitionTime":"2025-12-09T14:58:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 09 14:58:57 crc kubenswrapper[4555]: I1209 14:58:57.160296 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 09 14:58:57 crc kubenswrapper[4555]: I1209 14:58:57.160348 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 09 14:58:57 crc kubenswrapper[4555]: I1209 14:58:57.160378 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 09 14:58:57 crc kubenswrapper[4555]: E1209 14:58:57.160470 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 09 14:58:57 crc kubenswrapper[4555]: E1209 14:58:57.160535 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 09 14:58:57 crc kubenswrapper[4555]: E1209 14:58:57.160737 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 09 14:58:57 crc kubenswrapper[4555]: I1209 14:58:57.160971 4555 scope.go:117] "RemoveContainer" containerID="5901d1e7ebd3d979587ab0b065c05f639de1f07b2139edac2d4cf80c289503ff"
Dec 09 14:58:57 crc kubenswrapper[4555]: I1209 14:58:57.181124 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 09 14:58:57 crc kubenswrapper[4555]: I1209 14:58:57.181268 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 09 14:58:57 crc kubenswrapper[4555]: I1209 14:58:57.181278 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 09 14:58:57 crc kubenswrapper[4555]: I1209 14:58:57.181289 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 09 14:58:57 crc kubenswrapper[4555]: I1209 14:58:57.181297 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:57Z","lastTransitionTime":"2025-12-09T14:58:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 09 14:58:57 crc kubenswrapper[4555]: I1209 14:58:57.283034 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 09 14:58:57 crc kubenswrapper[4555]: I1209 14:58:57.283065 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 09 14:58:57 crc kubenswrapper[4555]: I1209 14:58:57.283074 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 09 14:58:57 crc kubenswrapper[4555]: I1209 14:58:57.283087 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 09 14:58:57 crc kubenswrapper[4555]: I1209 14:58:57.283095 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:57Z","lastTransitionTime":"2025-12-09T14:58:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 09 14:58:57 crc kubenswrapper[4555]: I1209 14:58:57.339805 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p4bl4_c87d128c-39da-4880-9efc-7d2e6319a854/ovnkube-controller/1.log"
Dec 09 14:58:57 crc kubenswrapper[4555]: I1209 14:58:57.342619 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" event={"ID":"c87d128c-39da-4880-9efc-7d2e6319a854","Type":"ContainerStarted","Data":"de4d6b9f459f3e66cc78f2c00c0e4c9a39554ea732d4f2e1d2b85175855ef2bd"}
Dec 09 14:58:57 crc kubenswrapper[4555]: I1209 14:58:57.342705 4555 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 09 14:58:57 crc kubenswrapper[4555]: I1209 14:58:57.353049 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d81323acabedcf739459ccc35af7dcbf68cd1159c9304c0ef841d774287425cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:57Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:57 crc kubenswrapper[4555]: I1209 14:58:57.367498 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:57Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:57 crc kubenswrapper[4555]: I1209 14:58:57.382954 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://222e3c51a925bbfd932c3620f13ad919006e2dad2c9063c60559ece4dd239f3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:57Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:57 crc kubenswrapper[4555]: I1209 14:58:57.385895 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:57 crc kubenswrapper[4555]: I1209 14:58:57.385917 4555 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:57 crc kubenswrapper[4555]: I1209 14:58:57.385944 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:57 crc kubenswrapper[4555]: I1209 14:58:57.385974 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:57 crc kubenswrapper[4555]: I1209 14:58:57.385983 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:57Z","lastTransitionTime":"2025-12-09T14:58:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:57 crc kubenswrapper[4555]: I1209 14:58:57.392984 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gjjcm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e34542c-29e8-43d3-add3-42aa080e0c8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b138ef0764baeeae025ff996cd643edabbc1de8f567e73f41fa748f481929656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-va
r-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqc9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gjjcm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:57Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:57 crc kubenswrapper[4555]: I1209 14:58:57.410786 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c87d128c-39da-4880-9efc-7d2e6319a854\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dee71204e7a71c311143a6ed6fbde69968bc877053d6a400d252eaa010a88ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6cd4a3aeb4f2f1ad5ad6b22f80134a393ccb4c5b7e22c33d97324c8f70800d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a05bba24ff7f1d956fa0d3cb5949de61a278cddb4674bdb723749a2b005d82d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec3d3a350dce6c9b5ea637d00abdaf43796a5595105dc82133f3d84349c12c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://932166905ddf0fc44b9b7356a8f5efe68e1d44307d8e486946c8285ae969b8e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64685d2778ef0cf9d316897042a9e9353603a0cc7a234baf821ec03b149aeb07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de4d6b9f459f3e66cc78f2c00c0e4c9a39554ea7
32d4f2e1d2b85175855ef2bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5901d1e7ebd3d979587ab0b065c05f639de1f07b2139edac2d4cf80c289503ff\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"message\\\":\\\"1.918808 5946 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-p4bl4 after 0 failed attempt(s)\\\\nI1209 14:58:41.919392 5946 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-p4bl4\\\\nI1209 14:58:41.918113 5946 services_controller.go:434] Service openshift-cluster-version/cluster-version-operator retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{cluster-version-operator openshift-cluster-version ddf4933a-f532-4906-9b8f-3b15aa433264 6187 0 2025-02-23 05:11:57 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:cluster-version-operator] map[exclude.release.openshift.io/internal-openshift-hosted:true include.release.openshift.io/self-managed-high-availability:true kubernetes.io/description:Expose cluster-version operator metrics to other in-cluster consumers. Access requires a prometheus-k8s RoleBinding in this namespace. service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:cluster-version-operator-serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc0006c6a07 \\\\u003cnil\\\\u003e}] [] 
[]},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Nam\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9dc3533604ee31d652a75a8c65146c35b320c41f2d4fe85c3c03cadb99a9d46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.16
8.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ce6bcdccee7782ed6a2472052064ea5f0f9a5f25a8645c692165a2a4bf16dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9ce6bcdccee7782ed6a2472052064ea5f0f9a5f25a8645c692165a2a4bf16dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p4bl4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:57Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:57 crc kubenswrapper[4555]: I1209 14:58:57.419836 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-72msc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4f13f82-a44f-461d-93b0-916a7ed0b925\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06c787dc46c305e7388f67f85601eb3e724c501d4da0504d4f8fd569f533c634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2spqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-72msc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:57Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:57 crc kubenswrapper[4555]: I1209 14:58:57.432739 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cb89a32-a132-4a5c-88ca-e561d2462abb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8452332d27457e371020c42533ef78285e33851568eed69be71bc9d10e663b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32422f3e8abddcc0d533cce0b4fc55c65309e00a02d4757d415d6f63f9b2e47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9db356805ba4080379f39ddf432276ac77417504341e77c1cc2d704756af1f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550d7f8a2e6cf94de70026192f2660d59f31fa1
675a7c7e432767db5bf58c1b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://630ad7ccda8e51ae4d116b5821e71ebcc3fb68b4614de58470610807ad351ed6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:57Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:57 crc kubenswrapper[4555]: I1209 14:58:57.441147 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:57Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:57 crc kubenswrapper[4555]: I1209 14:58:57.449394 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd5a66337668da0b71f8033ec52d3d1ed9ac69f468d6e48c1e863204ec6a6349\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2fd50693723f3d30a7f5fb890ed870f55b357017b85cdf7206ffc97d0d3d9b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:57Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:57 crc kubenswrapper[4555]: I1209 14:58:57.456360 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b929ab12-7d60-4e9f-a1a7-3e2d8b086d36\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04e20c1851a9313f071ddc6ae1191e467f3bea284ea6e0defd914dc5c926b7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff6nf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b220eef2e4180fd65d4a5381999b91753bb7322a9ab7101edf783cb800715fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernete
s.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff6nf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x6p6q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:57Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:57 crc kubenswrapper[4555]: I1209 14:58:57.465461 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tpsq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd8f66c1-da34-466f-ba53-4848c96b1287\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ae3cc97b46fa36f9bd6f439124d81700af58cb44671d46ffa2936260f6dda4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2028082f4c35a431ec183abac3ace39ae9d9d292a8b3b879cf35cf03dc606d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2028082f4c35a431ec183abac3ace39ae9d9d292a8b3b879cf35cf03dc606d
21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f6498db60b815cc152db11a9d809f415e6ed632f66cf7eaeecbacb230d59dbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f6498db60b815cc152db11a9d809f415e6ed632f66cf7eaeecbacb230d59dbe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd308a4726360ffaa40fd9895d4684062d998a9dec9c76222a4be5140ece0ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd308a4726360ffaa40fd9895d4684062d998a9dec9c76222a4be5140ece0ced\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e047250a4bced8e37ee26d9ba3065a61294fa9cc6d9491b6b96e35758b431edc\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e047250a4bced8e37ee26d9ba3065a61294fa9cc6d9491b6b96e35758b431edc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d24862fb9296afeb3ec86c0cd5e862021c7e4c8a0f139f9fd11dfd1faba2b392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d24862fb9296afeb3ec86c0cd5e862021c7e4c8a0f139f9fd11dfd1faba2b392\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ac07e538318096b1c9c2c8d0608f0b4d9b4367df0bb24575591b2f3f6c28c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ac07e538318096b1c9c2c8d0608f0b4d9b4367df0bb24575591b2f3f6c28c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tpsq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:57Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:57 crc kubenswrapper[4555]: I1209 14:58:57.475027 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qchk6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae951ddc-0e90-497c-af85-8e39088f32d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59535f0584581ae9d42fa3c1d9f4f204c8a2c7c96d211a2b8d74806eb3901a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwhfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1a5e10f192fc24405b980ce611a09f1e0c6d49c4c94df89bceef151a747f1aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountP
ath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwhfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qchk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:57Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:57 crc kubenswrapper[4555]: I1209 14:58:57.486523 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t6kl9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c97d4dc5-ce47-4a7b-9511-7168db0ee4b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd97a458fe790bf9d7a8587a77d464c8bc4f97919bf4646660d92bd016be2707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47xhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t6kl9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:57Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:57 crc kubenswrapper[4555]: I1209 14:58:57.487617 
4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:57 crc kubenswrapper[4555]: I1209 14:58:57.487643 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:57 crc kubenswrapper[4555]: I1209 14:58:57.487652 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:57 crc kubenswrapper[4555]: I1209 14:58:57.487664 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:57 crc kubenswrapper[4555]: I1209 14:58:57.487672 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:57Z","lastTransitionTime":"2025-12-09T14:58:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:57 crc kubenswrapper[4555]: I1209 14:58:57.498436 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23358419-437c-41bf-bf1f-0da572d7ae77\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5ec008c2fa2ff6b0bffe7a32708ff1d0bfcfc0b6edbdd084977b99a2051bdc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://376e853c04e3ccafef162872380c7bf1dc0301c04c70c5428b965ca18989088f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,
\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6099f0072464b66ae672e70c295fe21a13657b81cc71dcecc55314cd91816c75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b8f1c6dd89c91a8ff2ca8baf361ee78811eeb8e10f2c4bb0504644cdd1302d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:57Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:57 crc kubenswrapper[4555]: I1209 14:58:57.506411 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:57Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:57 crc kubenswrapper[4555]: I1209 14:58:57.513548 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6sc2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35f7e7bd-b667-42fa-b040-4a186e4570e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mc8nv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mc8nv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:46Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6sc2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:57Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:57 crc kubenswrapper[4555]: I1209 14:58:57.523535 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"031bef68-de1b-47ae-92d9-8f3ab57baecb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd53bcc6affff59de2add405be1496710b4488d3794d358533ddaeecb1c50c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cd3af11c101edd63da0124a2190c0ee60408cf12e241de0b5743aff5732738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://787954aa7f87026339e7b167ba10c60c81f91c4d0f234fc508684f929edda0ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c591bf964c364ed5dfb0b30ae43f7fd456c549f701b0f1b8f7ab9ca7b8648f1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7803e19de1c3c096dfb29cd40b66a640cf8fe695b24eaf0ecfef38e88feefc27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:57Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:57 crc kubenswrapper[4555]: I1209 14:58:57.589518 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:57 crc kubenswrapper[4555]: I1209 14:58:57.589551 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:57 crc kubenswrapper[4555]: I1209 14:58:57.589559 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:57 crc kubenswrapper[4555]: I1209 14:58:57.589572 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 
14:58:57 crc kubenswrapper[4555]: I1209 14:58:57.589580 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:57Z","lastTransitionTime":"2025-12-09T14:58:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:57 crc kubenswrapper[4555]: I1209 14:58:57.691559 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:57 crc kubenswrapper[4555]: I1209 14:58:57.691589 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:57 crc kubenswrapper[4555]: I1209 14:58:57.691597 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:57 crc kubenswrapper[4555]: I1209 14:58:57.691612 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:57 crc kubenswrapper[4555]: I1209 14:58:57.691620 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:57Z","lastTransitionTime":"2025-12-09T14:58:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:57 crc kubenswrapper[4555]: I1209 14:58:57.793249 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:57 crc kubenswrapper[4555]: I1209 14:58:57.793294 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:57 crc kubenswrapper[4555]: I1209 14:58:57.793315 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:57 crc kubenswrapper[4555]: I1209 14:58:57.793330 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:57 crc kubenswrapper[4555]: I1209 14:58:57.793339 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:57Z","lastTransitionTime":"2025-12-09T14:58:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:57 crc kubenswrapper[4555]: I1209 14:58:57.895265 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:57 crc kubenswrapper[4555]: I1209 14:58:57.895330 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:57 crc kubenswrapper[4555]: I1209 14:58:57.895340 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:57 crc kubenswrapper[4555]: I1209 14:58:57.895351 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:57 crc kubenswrapper[4555]: I1209 14:58:57.895360 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:57Z","lastTransitionTime":"2025-12-09T14:58:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:57 crc kubenswrapper[4555]: I1209 14:58:57.996839 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:57 crc kubenswrapper[4555]: I1209 14:58:57.996870 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:57 crc kubenswrapper[4555]: I1209 14:58:57.996878 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:57 crc kubenswrapper[4555]: I1209 14:58:57.996889 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:57 crc kubenswrapper[4555]: I1209 14:58:57.996897 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:57Z","lastTransitionTime":"2025-12-09T14:58:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:58 crc kubenswrapper[4555]: I1209 14:58:58.097921 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:58 crc kubenswrapper[4555]: I1209 14:58:58.097950 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:58 crc kubenswrapper[4555]: I1209 14:58:58.097958 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:58 crc kubenswrapper[4555]: I1209 14:58:58.097969 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:58 crc kubenswrapper[4555]: I1209 14:58:58.097978 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:58Z","lastTransitionTime":"2025-12-09T14:58:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:58 crc kubenswrapper[4555]: I1209 14:58:58.160183 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sc2q" Dec 09 14:58:58 crc kubenswrapper[4555]: E1209 14:58:58.160274 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6sc2q" podUID="35f7e7bd-b667-42fa-b040-4a186e4570e3" Dec 09 14:58:58 crc kubenswrapper[4555]: I1209 14:58:58.199499 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:58 crc kubenswrapper[4555]: I1209 14:58:58.199521 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:58 crc kubenswrapper[4555]: I1209 14:58:58.199528 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:58 crc kubenswrapper[4555]: I1209 14:58:58.199539 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:58 crc kubenswrapper[4555]: I1209 14:58:58.199546 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:58Z","lastTransitionTime":"2025-12-09T14:58:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:58 crc kubenswrapper[4555]: I1209 14:58:58.301553 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:58 crc kubenswrapper[4555]: I1209 14:58:58.301579 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:58 crc kubenswrapper[4555]: I1209 14:58:58.301588 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:58 crc kubenswrapper[4555]: I1209 14:58:58.301597 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:58 crc kubenswrapper[4555]: I1209 14:58:58.301603 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:58Z","lastTransitionTime":"2025-12-09T14:58:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:58 crc kubenswrapper[4555]: I1209 14:58:58.345650 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p4bl4_c87d128c-39da-4880-9efc-7d2e6319a854/ovnkube-controller/2.log" Dec 09 14:58:58 crc kubenswrapper[4555]: I1209 14:58:58.346118 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p4bl4_c87d128c-39da-4880-9efc-7d2e6319a854/ovnkube-controller/1.log" Dec 09 14:58:58 crc kubenswrapper[4555]: I1209 14:58:58.348775 4555 generic.go:334] "Generic (PLEG): container finished" podID="c87d128c-39da-4880-9efc-7d2e6319a854" containerID="de4d6b9f459f3e66cc78f2c00c0e4c9a39554ea732d4f2e1d2b85175855ef2bd" exitCode=1 Dec 09 14:58:58 crc kubenswrapper[4555]: I1209 14:58:58.348811 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" event={"ID":"c87d128c-39da-4880-9efc-7d2e6319a854","Type":"ContainerDied","Data":"de4d6b9f459f3e66cc78f2c00c0e4c9a39554ea732d4f2e1d2b85175855ef2bd"} Dec 09 14:58:58 crc kubenswrapper[4555]: I1209 14:58:58.348865 4555 scope.go:117] "RemoveContainer" containerID="5901d1e7ebd3d979587ab0b065c05f639de1f07b2139edac2d4cf80c289503ff" Dec 09 14:58:58 crc kubenswrapper[4555]: I1209 14:58:58.349812 4555 scope.go:117] "RemoveContainer" containerID="de4d6b9f459f3e66cc78f2c00c0e4c9a39554ea732d4f2e1d2b85175855ef2bd" Dec 09 14:58:58 crc kubenswrapper[4555]: E1209 14:58:58.350164 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-p4bl4_openshift-ovn-kubernetes(c87d128c-39da-4880-9efc-7d2e6319a854)\"" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" podUID="c87d128c-39da-4880-9efc-7d2e6319a854" Dec 09 14:58:58 crc kubenswrapper[4555]: I1209 14:58:58.359771 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"031bef68-de1b-47ae-92d9-8f3ab57baecb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd53bcc6affff59de2add405be1496710b4488d3794d358533ddaeecb1c50c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cd3af11c101edd63da0124a2190c0ee60408cf12e241de0b5743aff5732738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://787954aa7f87026339e7b167ba10c60c81f91c4d0f234fc508684f929edda0ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c591bf964c364ed5dfb0b30ae43f7fd456c549f701b0f1b8f7ab9ca7b8648f1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7803e19de1c3c096dfb29cd40b66a640cf8fe695b24eaf0ecfef38e88feefc27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:58Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:58 crc kubenswrapper[4555]: I1209 14:58:58.367508 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23358419-437c-41bf-bf1f-0da572d7ae77\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5ec008c2fa2ff6b0bffe7a32708ff1d0bfcfc0b6edbdd084977b99a2051bdc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://376e853c04e3ccafef162872380c7bf1dc0301c04c70c5428b965ca18989088f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6099f0072464b66ae672e70c295fe21a13657b81cc71dcecc55314cd91816c75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b8f1c6dd89c91a8ff2ca8baf361ee78811eeb8e10f2c4bb0504644cdd1302d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:58Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:58 crc kubenswrapper[4555]: I1209 14:58:58.375043 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:58Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:58 crc kubenswrapper[4555]: I1209 14:58:58.381934 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6sc2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35f7e7bd-b667-42fa-b040-4a186e4570e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mc8nv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mc8nv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:46Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6sc2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:58Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:58 crc kubenswrapper[4555]: I1209 14:58:58.388765 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-72msc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4f13f82-a44f-461d-93b0-916a7ed0b925\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06c787dc46c305e7388f67f85601eb3e724c501d4da0504d4f8fd569f533c634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2spqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-72msc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:58Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:58 crc kubenswrapper[4555]: I1209 14:58:58.401694 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cb89a32-a132-4a5c-88ca-e561d2462abb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8452332d27457e371020c42533ef78285e33851568eed69be71bc9d10e663b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32422f3e8abddcc0d533cce0b4fc55c65309e00a02d4757d415d6f63f9b2e47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9db356805ba4080379f39ddf432276ac77417504341e77c1cc2d704756af1f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550d7f8a2e6cf94de70026192f2660d59f31fa1
675a7c7e432767db5bf58c1b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://630ad7ccda8e51ae4d116b5821e71ebcc3fb68b4614de58470610807ad351ed6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:58Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:58 crc kubenswrapper[4555]: I1209 14:58:58.402805 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:58 crc kubenswrapper[4555]: I1209 14:58:58.402838 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:58 crc kubenswrapper[4555]: I1209 14:58:58.402865 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:58 crc kubenswrapper[4555]: I1209 14:58:58.402878 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:58 crc kubenswrapper[4555]: I1209 14:58:58.402886 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:58Z","lastTransitionTime":"2025-12-09T14:58:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:58 crc kubenswrapper[4555]: I1209 14:58:58.410004 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d81323acabedcf739459ccc35af7dcbf68cd1159c9304c0ef841d774287425cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:58Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:58 crc kubenswrapper[4555]: I1209 14:58:58.417498 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:58Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:58 crc kubenswrapper[4555]: I1209 14:58:58.424210 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://222e3c51a925bbfd932c3620f13ad919006e2dad2c9063c60559ece4dd239f3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:58Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:58 crc kubenswrapper[4555]: I1209 14:58:58.431702 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gjjcm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e34542c-29e8-43d3-add3-42aa080e0c8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b138ef0764baeeae025ff996cd643edabbc1de8f567e73f41fa748f481929656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqc9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gjjcm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:58Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:58 crc kubenswrapper[4555]: I1209 14:58:58.443278 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c87d128c-39da-4880-9efc-7d2e6319a854\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dee71204e7a71c311143a6ed6fbde69968bc877053d6a400d252eaa010a88ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6cd4a3aeb4f2f1ad5ad6b22f80134a393ccb4c5b7e22c33d97324c8f70800d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a05bba24ff7f1d956fa0d3cb5949de61a278cddb4674bdb723749a2b005d82d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec3d3a350dce6c9b5ea637d00abdaf43796a5595105dc82133f3d84349c12c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://932166905ddf0fc44b9b7356a8f5efe68e1d44307d8e486946c8285ae969b8e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64685d2778ef0cf9d316897042a9e9353603a0cc7a234baf821ec03b149aeb07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de4d6b9f459f3e66cc78f2c00c0e4c9a39554ea7
32d4f2e1d2b85175855ef2bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5901d1e7ebd3d979587ab0b065c05f639de1f07b2139edac2d4cf80c289503ff\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"message\\\":\\\"1.918808 5946 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-p4bl4 after 0 failed attempt(s)\\\\nI1209 14:58:41.919392 5946 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-p4bl4\\\\nI1209 14:58:41.918113 5946 services_controller.go:434] Service openshift-cluster-version/cluster-version-operator retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{cluster-version-operator openshift-cluster-version ddf4933a-f532-4906-9b8f-3b15aa433264 6187 0 2025-02-23 05:11:57 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:cluster-version-operator] map[exclude.release.openshift.io/internal-openshift-hosted:true include.release.openshift.io/self-managed-high-availability:true kubernetes.io/description:Expose cluster-version operator metrics to other in-cluster consumers. Access requires a prometheus-k8s RoleBinding in this namespace. service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:cluster-version-operator-serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc0006c6a07 \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Nam\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de4d6b9f459f3e66cc78f2c00c0e4c9a39554ea732d4f2e1d2b85175855ef2bd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T14:58:57Z\\\",\\\"message\\\":\\\"ice k8s.ovn.org/owner:openshift-machine-config-operator/machine-config-operator]} name:Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.183:9001:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {5b85277d-d9b7-4a68-8e4e-2b80594d9347}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1209 14:58:57.738539 6180 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver/api]} name:Service_openshift-apiserver/api_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.37:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{88e20c31-5b8d-4d44-bbd8-dba87b7dbaf0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1209 14:58:57.738561 6180 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9dc3533604ee31d652a75a8c65146c35b320c41f2d4fe85c3c03cadb99a9d46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"ho
stIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ce6bcdccee7782ed6a2472052064ea5f0f9a5f25a8645c692165a2a4bf16dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9ce6bcdccee7782ed6a2472052064ea5f0f9a5f25a8645c692165a2a4bf16dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p4bl4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:58Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:58 crc kubenswrapper[4555]: I1209 14:58:58.450539 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:58Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:58 crc kubenswrapper[4555]: I1209 14:58:58.458229 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd5a66337668da0b71f8033ec52d3d1ed9ac69f468d6e48c1e863204ec6a6349\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2fd50693723f3d30a7f5fb890ed870f55b357017b85cdf7206ffc97d0d3d9b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:58Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:58 crc kubenswrapper[4555]: I1209 14:58:58.464957 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b929ab12-7d60-4e9f-a1a7-3e2d8b086d36\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04e20c1851a9313f071ddc6ae1191e467f3bea284ea6e0defd914dc5c926b7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff6nf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b220eef2e4180fd65d4a5381999b91753bb7322a9ab7101edf783cb800715fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernete
s.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff6nf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x6p6q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:58Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:58 crc kubenswrapper[4555]: I1209 14:58:58.473578 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tpsq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd8f66c1-da34-466f-ba53-4848c96b1287\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ae3cc97b46fa36f9bd6f439124d81700af58cb44671d46ffa2936260f6dda4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2028082f4c35a431ec183abac3ace39ae9d9d292a8b3b879cf35cf03dc606d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2028082f4c35a431ec183abac3ace39ae9d9d292a8b3b879cf35cf03dc606d
21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f6498db60b815cc152db11a9d809f415e6ed632f66cf7eaeecbacb230d59dbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f6498db60b815cc152db11a9d809f415e6ed632f66cf7eaeecbacb230d59dbe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd308a4726360ffaa40fd9895d4684062d998a9dec9c76222a4be5140ece0ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd308a4726360ffaa40fd9895d4684062d998a9dec9c76222a4be5140ece0ced\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e047250a4bced8e37ee26d9ba3065a61294fa9cc6d9491b6b96e35758b431edc\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e047250a4bced8e37ee26d9ba3065a61294fa9cc6d9491b6b96e35758b431edc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d24862fb9296afeb3ec86c0cd5e862021c7e4c8a0f139f9fd11dfd1faba2b392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d24862fb9296afeb3ec86c0cd5e862021c7e4c8a0f139f9fd11dfd1faba2b392\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ac07e538318096b1c9c2c8d0608f0b4d9b4367df0bb24575591b2f3f6c28c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ac07e538318096b1c9c2c8d0608f0b4d9b4367df0bb24575591b2f3f6c28c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tpsq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:58Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:58 crc kubenswrapper[4555]: I1209 14:58:58.480318 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qchk6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae951ddc-0e90-497c-af85-8e39088f32d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59535f0584581ae9d42fa3c1d9f4f204c8a2c7c96d211a2b8d74806eb3901a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwhfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1a5e10f192fc24405b980ce611a09f1e0c6d49c4c94df89bceef151a747f1aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountP
ath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwhfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qchk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:58Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:58 crc kubenswrapper[4555]: I1209 14:58:58.486203 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t6kl9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c97d4dc5-ce47-4a7b-9511-7168db0ee4b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd97a458fe790bf9d7a8587a77d464c8bc4f97919bf4646660d92bd016be2707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47xhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t6kl9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:58Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:58 crc kubenswrapper[4555]: I1209 14:58:58.504368 
4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:58 crc kubenswrapper[4555]: I1209 14:58:58.504396 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:58 crc kubenswrapper[4555]: I1209 14:58:58.504404 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:58 crc kubenswrapper[4555]: I1209 14:58:58.504416 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:58 crc kubenswrapper[4555]: I1209 14:58:58.504424 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:58Z","lastTransitionTime":"2025-12-09T14:58:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:58 crc kubenswrapper[4555]: I1209 14:58:58.605773 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:58 crc kubenswrapper[4555]: I1209 14:58:58.605807 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:58 crc kubenswrapper[4555]: I1209 14:58:58.605816 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:58 crc kubenswrapper[4555]: I1209 14:58:58.605827 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:58 crc kubenswrapper[4555]: I1209 14:58:58.605837 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:58Z","lastTransitionTime":"2025-12-09T14:58:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:58 crc kubenswrapper[4555]: I1209 14:58:58.707580 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:58 crc kubenswrapper[4555]: I1209 14:58:58.707751 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:58 crc kubenswrapper[4555]: I1209 14:58:58.707812 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:58 crc kubenswrapper[4555]: I1209 14:58:58.707865 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:58 crc kubenswrapper[4555]: I1209 14:58:58.707929 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:58Z","lastTransitionTime":"2025-12-09T14:58:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:58 crc kubenswrapper[4555]: I1209 14:58:58.810070 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:58 crc kubenswrapper[4555]: I1209 14:58:58.810095 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:58 crc kubenswrapper[4555]: I1209 14:58:58.810104 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:58 crc kubenswrapper[4555]: I1209 14:58:58.810115 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:58 crc kubenswrapper[4555]: I1209 14:58:58.810133 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:58Z","lastTransitionTime":"2025-12-09T14:58:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:58 crc kubenswrapper[4555]: I1209 14:58:58.911082 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:58 crc kubenswrapper[4555]: I1209 14:58:58.911114 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:58 crc kubenswrapper[4555]: I1209 14:58:58.911136 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:58 crc kubenswrapper[4555]: I1209 14:58:58.911149 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:58 crc kubenswrapper[4555]: I1209 14:58:58.911157 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:58Z","lastTransitionTime":"2025-12-09T14:58:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:59 crc kubenswrapper[4555]: I1209 14:58:59.011582 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 14:58:59 crc kubenswrapper[4555]: I1209 14:58:59.011647 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:58:59 crc kubenswrapper[4555]: I1209 14:58:59.011690 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:58:59 crc kubenswrapper[4555]: E1209 14:58:59.011743 4555 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 09 14:58:59 crc kubenswrapper[4555]: E1209 14:58:59.011778 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-09 14:59:31.011767609 +0000 UTC m=+81.148480326 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 09 14:58:59 crc kubenswrapper[4555]: E1209 14:58:59.011880 4555 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 09 14:58:59 crc kubenswrapper[4555]: E1209 14:58:59.011943 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-09 14:59:31.011925385 +0000 UTC m=+81.148638123 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 09 14:58:59 crc kubenswrapper[4555]: E1209 14:58:59.012112 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-09 14:59:31.01209306 +0000 UTC m=+81.148805777 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 14:58:59 crc kubenswrapper[4555]: I1209 14:58:59.013420 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:59 crc kubenswrapper[4555]: I1209 14:58:59.013442 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:59 crc kubenswrapper[4555]: I1209 14:58:59.013450 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:59 crc kubenswrapper[4555]: I1209 14:58:59.013459 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:59 crc kubenswrapper[4555]: I1209 14:58:59.013467 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:59Z","lastTransitionTime":"2025-12-09T14:58:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:59 crc kubenswrapper[4555]: I1209 14:58:59.112448 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:58:59 crc kubenswrapper[4555]: I1209 14:58:59.112507 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:58:59 crc kubenswrapper[4555]: E1209 14:58:59.112549 4555 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 09 14:58:59 crc kubenswrapper[4555]: E1209 14:58:59.112571 4555 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 09 14:58:59 crc kubenswrapper[4555]: E1209 14:58:59.112583 4555 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 14:58:59 crc kubenswrapper[4555]: E1209 14:58:59.112587 4555 projected.go:288] 
Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 09 14:58:59 crc kubenswrapper[4555]: E1209 14:58:59.112608 4555 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 09 14:58:59 crc kubenswrapper[4555]: E1209 14:58:59.112618 4555 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 14:58:59 crc kubenswrapper[4555]: E1209 14:58:59.112625 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-09 14:59:31.112613119 +0000 UTC m=+81.249325836 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 14:58:59 crc kubenswrapper[4555]: E1209 14:58:59.112646 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-09 14:59:31.112637093 +0000 UTC m=+81.249349812 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 14:58:59 crc kubenswrapper[4555]: I1209 14:58:59.115172 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:59 crc kubenswrapper[4555]: I1209 14:58:59.115195 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:59 crc kubenswrapper[4555]: I1209 14:58:59.115205 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:59 crc kubenswrapper[4555]: I1209 14:58:59.115218 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:59 crc kubenswrapper[4555]: I1209 14:58:59.115227 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:59Z","lastTransitionTime":"2025-12-09T14:58:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:59 crc kubenswrapper[4555]: I1209 14:58:59.160892 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:58:59 crc kubenswrapper[4555]: I1209 14:58:59.160892 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:58:59 crc kubenswrapper[4555]: E1209 14:58:59.161761 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 14:58:59 crc kubenswrapper[4555]: I1209 14:58:59.161788 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:58:59 crc kubenswrapper[4555]: E1209 14:58:59.161841 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 14:58:59 crc kubenswrapper[4555]: E1209 14:58:59.161913 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 14:58:59 crc kubenswrapper[4555]: I1209 14:58:59.216637 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:59 crc kubenswrapper[4555]: I1209 14:58:59.216669 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:59 crc kubenswrapper[4555]: I1209 14:58:59.216677 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:59 crc kubenswrapper[4555]: I1209 14:58:59.216687 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:59 crc kubenswrapper[4555]: I1209 14:58:59.216695 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:59Z","lastTransitionTime":"2025-12-09T14:58:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:59 crc kubenswrapper[4555]: I1209 14:58:59.318447 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:59 crc kubenswrapper[4555]: I1209 14:58:59.318487 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:59 crc kubenswrapper[4555]: I1209 14:58:59.318495 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:59 crc kubenswrapper[4555]: I1209 14:58:59.318504 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:59 crc kubenswrapper[4555]: I1209 14:58:59.318513 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:59Z","lastTransitionTime":"2025-12-09T14:58:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:59 crc kubenswrapper[4555]: I1209 14:58:59.337890 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 09 14:58:59 crc kubenswrapper[4555]: I1209 14:58:59.344292 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 09 14:58:59 crc kubenswrapper[4555]: I1209 14:58:59.347641 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"031bef68-de1b-47ae-92d9-8f3ab57baecb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd53bcc6affff59de2add405be1496710b4488d3794d358533ddaeecb1c50c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cd3af11c101edd63da0124a2190c0ee60408cf12e241de0b5743aff5732738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://787954aa7f87026339e7b167ba10c60c81f91c4d0f234fc508684f929edda0ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c591bf964c364ed5dfb0b30ae43f7fd456c549f701b0f1b8f7ab9ca7b8648f1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7803e19de1c3c096dfb29cd40b66a640cf8fe695b24eaf0ecfef38e88feefc27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:59Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:59 crc kubenswrapper[4555]: I1209 14:58:59.351350 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p4bl4_c87d128c-39da-4880-9efc-7d2e6319a854/ovnkube-controller/2.log" Dec 09 14:58:59 crc kubenswrapper[4555]: I1209 14:58:59.355643 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23358419-437c-41bf-bf1f-0da572d7ae77\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5ec008c2fa2ff6b0bffe7a32708ff1d0bfcfc0b6edbdd084977b99a2051bdc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://376e853c04e3ccafef162872380c7bf1dc0301c04c70c5428b965ca18989088f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6099f0072464b66ae672e70c295fe21a13657b81cc71dcecc55314cd91816c75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b8f1c6dd89c91a8ff2ca8baf361ee78811eeb8e10f2c4bb0504644cdd1302d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:59Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:59 crc kubenswrapper[4555]: I1209 14:58:59.363205 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:59Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:59 crc kubenswrapper[4555]: I1209 14:58:59.370003 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6sc2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35f7e7bd-b667-42fa-b040-4a186e4570e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mc8nv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mc8nv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:46Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6sc2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:59Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:59 crc kubenswrapper[4555]: I1209 14:58:59.376815 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://222e3c51a925bbfd932c3620f13ad919006e2dad2c9063c60559ece4dd239f3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:59Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:59 crc kubenswrapper[4555]: I1209 14:58:59.384450 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gjjcm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e34542c-29e8-43d3-add3-42aa080e0c8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b138ef0764baeeae025ff996cd643edabbc1de8f567e73f41fa748f481929656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqc9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gjjcm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:59Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:59 crc kubenswrapper[4555]: I1209 14:58:59.397339 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c87d128c-39da-4880-9efc-7d2e6319a854\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dee71204e7a71c311143a6ed6fbde69968bc877053d6a400d252eaa010a88ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6cd4a3aeb4f2f1ad5ad6b22f80134a393ccb4c5b7e22c33d97324c8f70800d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a05bba24ff7f1d956fa0d3cb5949de61a278cddb4674bdb723749a2b005d82d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec3d3a350dce6c9b5ea637d00abdaf43796a5595105dc82133f3d84349c12c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://932166905ddf0fc44b9b7356a8f5efe68e1d44307d8e486946c8285ae969b8e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64685
d2778ef0cf9d316897042a9e9353603a0cc7a234baf821ec03b149aeb07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de4d6b9f459f3e66cc78f2c00c0e4c9a39554ea732d4f2e1d2b85175855ef2bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5901d1e7ebd3d979587ab0b065c05f639de1f07b2139edac2d4cf80c289503ff\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"message\\\":\\\"1.918808 5946 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-p4bl4 after 0 failed attempt(s)\\\\nI1209 14:58:41.919392 5946 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-p4bl4\\\\nI1209 14:58:41.918113 5946 services_controller.go:434] Service openshift-cluster-version/cluster-version-operator retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{cluster-version-operator openshift-cluster-version ddf4933a-f532-4906-9b8f-3b15aa433264 6187 0 2025-02-23 05:11:57 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:cluster-version-operator] map[exclude.release.openshift.io/internal-openshift-hosted:true include.release.openshift.io/self-managed-high-availability:true kubernetes.io/description:Expose cluster-version operator metrics to other in-cluster consumers. Access requires a prometheus-k8s RoleBinding in this namespace. 
service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:cluster-version-operator-serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc0006c6a07 \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Nam\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de4d6b9f459f3e66cc78f2c00c0e4c9a39554ea732d4f2e1d2b85175855ef2bd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T14:58:57Z\\\",\\\"message\\\":\\\"ice k8s.ovn.org/owner:openshift-machine-config-operator/machine-config-operator]} name:Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.183:9001:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {5b85277d-d9b7-4a68-8e4e-2b80594d9347}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1209 14:58:57.738539 6180 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver/api]} name:Service_openshift-apiserver/api_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.37:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {88e20c31-5b8d-4d44-bbd8-dba87b7dbaf0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1209 14:58:57.738561 6180 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service 
k8s.ovn\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9dc3533604ee31d652a75a8c65146c35b320c41f2d4fe85c3c03cadb99a9d46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ce6bcdccee7782ed6a2472052064ea5f0f9a5f25a8645c692165a2a4bf16dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0
d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9ce6bcdccee7782ed6a2472052064ea5f0f9a5f25a8645c692165a2a4bf16dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p4bl4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:59Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:59 crc kubenswrapper[4555]: I1209 14:58:59.403669 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-72msc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4f13f82-a44f-461d-93b0-916a7ed0b925\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06c787dc46c305e7388f67f85601eb3e724c501d4da0504d4f8fd569f533c634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2spqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.1
68.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-72msc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:59Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:59 crc kubenswrapper[4555]: I1209 14:58:59.416381 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cb89a32-a132-4a5c-88ca-e561d2462abb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8452332d27457e371020c42533ef78285e33851568eed69be71bc9d10e663b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32422f3e8abddcc0d533cce0b4fc55c65309e00a02d4757d415d6f63f9b2e47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9db356805ba4080
379f39ddf432276ac77417504341e77c1cc2d704756af1f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550d7f8a2e6cf94de70026192f2660d59f31fa1675a7c7e432767db5bf58c1b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://630ad7ccda8e51ae4d116b5821e71ebcc3fb68b4614de58470610807ad351ed6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"
image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:59Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:59 crc kubenswrapper[4555]: I1209 14:58:59.420433 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:59 crc kubenswrapper[4555]: I1209 14:58:59.420476 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:59 crc kubenswrapper[4555]: I1209 14:58:59.420486 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:59 crc kubenswrapper[4555]: I1209 14:58:59.420498 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:59 crc kubenswrapper[4555]: I1209 14:58:59.420505 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:59Z","lastTransitionTime":"2025-12-09T14:58:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:59 crc kubenswrapper[4555]: I1209 14:58:59.425099 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d81323acabedcf739459ccc35af7dcbf68cd1159c9304c0ef841d774287425cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:59Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:59 crc kubenswrapper[4555]: I1209 14:58:59.432195 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:59Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:59 crc kubenswrapper[4555]: I1209 14:58:59.440620 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tpsq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd8f66c1-da34-466f-ba53-4848c96b1287\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ae3cc97b46fa36f9bd6f439124d81700af58cb44671d46ffa2936260f6dda4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"container
ID\\\":\\\"cri-o://2028082f4c35a431ec183abac3ace39ae9d9d292a8b3b879cf35cf03dc606d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2028082f4c35a431ec183abac3ace39ae9d9d292a8b3b879cf35cf03dc606d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f6498db60b815cc152db11a9d809f415e6ed632f66cf7eaeecbacb230d59dbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f6498db60b815cc152db11a9d809f415e6ed632f66cf7eaeecbacb230d59dbe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd308a4726360ffaa40fd9895d4684062d998a9dec9c76222a4be5140ece0ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd308a4726360ffaa40fd9895d4684062d998a9dec9c76222a4be5140ece0ced\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:35Z\\\"}},\\\"volumeMo
unts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e047250a4bced8e37ee26d9ba3065a61294fa9cc6d9491b6b96e35758b431edc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e047250a4bced8e37ee26d9ba3065a61294fa9cc6d9491b6b96e35758b431edc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d24862fb9296afeb3ec86c0cd5e862021c7e4c8a0f139f9fd11dfd1faba2b392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d24862fb9296afeb3ec86c0cd5e862021c7e4c8a0f139f9fd11dfd1faba2b392\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ac07e538318096b1c9c2c8d0608f0b4d9b4367df0bb24575591b2f3f6c28c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ac07e538318096b1c9c2c8d0608f0b4d9b4367df0bb24575591b2f3f6c28c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tpsq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:59Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:59 crc kubenswrapper[4555]: I1209 14:58:59.447064 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qchk6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae951ddc-0e90-497c-af85-8e39088f32d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59535f0584581ae9d42fa3c1d9f4f204c8a2c7c96d211a2b8d74806eb3901a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwhfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1a5e10f192fc24405b980ce611a09f1e0c6d49c4c94df89bceef151a747f1aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwhfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qchk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:59Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:59 crc kubenswrapper[4555]: I1209 14:58:59.454125 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:59Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:59 crc kubenswrapper[4555]: I1209 14:58:59.461521 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd5a66337668da0b71f8033ec52d3d1ed9ac69f468d6e48c1e863204ec6a6349\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2fd50693723f3d30a7f5fb890ed870f55b357017b85cdf7206ffc97d0d3d9b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:59Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:59 crc kubenswrapper[4555]: I1209 14:58:59.468288 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b929ab12-7d60-4e9f-a1a7-3e2d8b086d36\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04e20c1851a9313f071ddc6ae1191e467f3bea284ea6e0defd914dc5c926b7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff6nf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b220eef2e4180fd65d4a5381999b91753bb7322a9ab7101edf783cb800715fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernete
s.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff6nf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x6p6q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:59Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:59 crc kubenswrapper[4555]: I1209 14:58:59.474200 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t6kl9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c97d4dc5-ce47-4a7b-9511-7168db0ee4b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd97a458fe790bf9d7a8587a77d464c8bc4f97919bf4646660d92bd016be2707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47xhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t6kl9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:58:59Z is after 2025-08-24T17:21:41Z" Dec 09 14:58:59 crc kubenswrapper[4555]: I1209 14:58:59.521789 4555 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:59 crc kubenswrapper[4555]: I1209 14:58:59.521810 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:59 crc kubenswrapper[4555]: I1209 14:58:59.521818 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:59 crc kubenswrapper[4555]: I1209 14:58:59.521827 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:59 crc kubenswrapper[4555]: I1209 14:58:59.521835 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:59Z","lastTransitionTime":"2025-12-09T14:58:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:59 crc kubenswrapper[4555]: I1209 14:58:59.623483 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:59 crc kubenswrapper[4555]: I1209 14:58:59.623528 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:59 crc kubenswrapper[4555]: I1209 14:58:59.623537 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:59 crc kubenswrapper[4555]: I1209 14:58:59.623547 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:59 crc kubenswrapper[4555]: I1209 14:58:59.623555 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:59Z","lastTransitionTime":"2025-12-09T14:58:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:59 crc kubenswrapper[4555]: I1209 14:58:59.724704 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:59 crc kubenswrapper[4555]: I1209 14:58:59.724745 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:59 crc kubenswrapper[4555]: I1209 14:58:59.724753 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:59 crc kubenswrapper[4555]: I1209 14:58:59.724763 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:59 crc kubenswrapper[4555]: I1209 14:58:59.724771 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:59Z","lastTransitionTime":"2025-12-09T14:58:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:58:59 crc kubenswrapper[4555]: I1209 14:58:59.826523 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:59 crc kubenswrapper[4555]: I1209 14:58:59.826562 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:59 crc kubenswrapper[4555]: I1209 14:58:59.826570 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:59 crc kubenswrapper[4555]: I1209 14:58:59.826585 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:59 crc kubenswrapper[4555]: I1209 14:58:59.826595 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:59Z","lastTransitionTime":"2025-12-09T14:58:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:58:59 crc kubenswrapper[4555]: I1209 14:58:59.927750 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:59 crc kubenswrapper[4555]: I1209 14:58:59.927784 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:59 crc kubenswrapper[4555]: I1209 14:58:59.927793 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:59 crc kubenswrapper[4555]: I1209 14:58:59.927805 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:58:59 crc kubenswrapper[4555]: I1209 14:58:59.927813 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:58:59Z","lastTransitionTime":"2025-12-09T14:58:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.029226 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.029250 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.029258 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.029269 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.029276 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:00Z","lastTransitionTime":"2025-12-09T14:59:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.130458 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.130489 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.130497 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.130509 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.130517 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:00Z","lastTransitionTime":"2025-12-09T14:59:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.161037 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sc2q" Dec 09 14:59:00 crc kubenswrapper[4555]: E1209 14:59:00.161153 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6sc2q" podUID="35f7e7bd-b667-42fa-b040-4a186e4570e3" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.168865 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd5a66337668da0b71f8033ec52d3d1ed9ac69f468d6e48c1e863204ec6a6349\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2fd50693723f3d30a7f5fb890ed870f55b357017b85cdf7206ffc97d0d3d9b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:00Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.175626 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b929ab12-7d60-4e9f-a1a7-3e2d8b086d36\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04e20c1851a9313f071ddc6ae1191e467f3bea284ea6e0defd914dc5c926b7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff6nf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b220eef2e4180fd65d4a5381999b91753bb7322a9ab7101edf783cb800715fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff6nf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x6p6q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:00Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.184161 4555 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tpsq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd8f66c1-da34-466f-ba53-4848c96b1287\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ae3cc97b46fa36f9bd6f439124d81700af58cb44671d46ffa2936260f6dda4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2028082f4c35a431ec183abac3ace39ae9d9d292a8b3b879cf35cf03dc606d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2028082f4c35a431ec183abac3ace39ae9d9d292a8b3b879cf35cf03dc606d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f6498db60b815cc152db11a9d809f415e6ed632f66cf7eaeecbacb230d59dbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2c
c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f6498db60b815cc152db11a9d809f415e6ed632f66cf7eaeecbacb230d59dbe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd308a4726360ffaa40fd9895d4684062d998a9dec9c76222a4be5140ece0ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd308a4726360ffaa40fd9895d4684062d998a9dec9c76222a4be5140ece0ced\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e047250a4bced8e37ee26d9ba3065a61294fa9cc6d9491b6b96e35758b431edc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e047250a4bced8e37ee26d9ba3065a61294fa9cc6d9491b6b96e35758b431edc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d24862fb9296afeb3ec86c0cd5e862021c7e4c8a0f139f9fd11dfd1faba2b392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d24862fb9296afeb3ec86c0cd5e862021c7e4c8a0f139f9fd11dfd1faba2b392\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ac07e538318096b1c9c2c8d0608f0b4d9b4367df0bb24575591b2f3f6c28c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ac07e538318096b1c9c2c8d0608f0b4d9b4367df0bb24575591b2f3f6c28c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tpsq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:00Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.191598 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qchk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae951ddc-0e90-497c-af85-8e39088f32d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59535f0584581ae9d42fa3c1d9f4f204c8a2c7c96d211a2b8d74806eb3901a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwhfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1a5e10f192fc24405b980ce611a09f1e0c6d49c4c94df89bceef151a747f1aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwhfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qchk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:00Z is after 2025-08-24T17:21:41Z" Dec 09 
14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.198690 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:00Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.204527 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t6kl9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c97d4dc5-ce47-4a7b-9511-7168db0ee4b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd97a458fe790bf9d7a8587a77d464c8bc4f97919bf4646660d92bd016be2707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47xhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t6kl9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:00Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.211890 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:00Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.220034 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6sc2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35f7e7bd-b667-42fa-b040-4a186e4570e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mc8nv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mc8nv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:46Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6sc2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:00Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.228241 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"031bef68-de1b-47ae-92d9-8f3ab57baecb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd53bcc6affff59de2add405be1496710b4488d3794d358533ddaeecb1c50c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cd3af11c101edd63da0124a2190c0ee60408cf12e241de0b5743aff5732738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://787954aa7f87026339e7b167ba10c60c81f91c4d0f234fc508684f929edda0ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c591bf964c364ed5dfb0b30ae43f7fd456c549f701b0f1b8f7ab9ca7b8648f1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7803e19de1c3c096dfb29cd40b66a640cf8fe695b24eaf0ecfef38e88feefc27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:00Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.232513 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.232541 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.232551 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.232564 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 
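The NodeNotReady condition being recorded here is independent of the webhook failure: the container runtime reports NetworkReady=false because it finds no CNI network configuration in /etc/kubernetes/cni/net.d/, and the kubelet therefore sets Ready=False with reason KubeletNotReady. The actual check lives in the runtime's CNI code; the sketch below only approximates its observable behaviour by treating any .conf, .conflist or .json file in that directory as a network configuration:

// cnicheck.go: approximate the readiness test behind
// "no CNI configuration file in /etc/kubernetes/cni/net.d/".
package main

import (
    "fmt"
    "os"
    "path/filepath"
)

func main() {
    confDir := "/etc/kubernetes/cni/net.d" // directory named in the log message
    entries, err := os.ReadDir(confDir)
    if err != nil {
        fmt.Println("cannot read CNI conf dir:", err)
        return
    }
    var confs []string
    for _, e := range entries {
        switch filepath.Ext(e.Name()) {
        case ".conf", ".conflist", ".json":
            confs = append(confs, e.Name())
        }
    }
    if len(confs) == 0 {
        fmt.Println("NetworkReady=false: no CNI configuration file in", confDir)
        return
    }
    fmt.Println("found CNI configurations:", confs)
}

Once the network plugin (OVN-Kubernetes here) writes its configuration into that directory, the condition should flip back to Ready and the repeated "Node became not ready" records should stop.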
14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.232574 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:00Z","lastTransitionTime":"2025-12-09T14:59:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.236477 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23358419-437c-41bf-bf1f-0da572d7ae77\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5ec008c2fa2ff6b0bffe7a32708ff1d0bfcfc0b6edbdd084977b99a2051bdc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://376e853c04e3ccafef162872380c7bf1dc0301c04c70c5428b965ca18989088f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6099f0072464b66ae672e70c295fe21a13657b81cc71dcecc55314cd91816c75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-c
ontroller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b8f1c6dd89c91a8ff2ca8baf361ee78811eeb8e10f2c4bb0504644cdd1302d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:00Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.244388 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d81323acabedcf739459ccc35af7dcbf68cd1159c9304c0ef841d774287425cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:00Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.251767 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:00Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.259198 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://222e3c51a925bbfd932c3620f13ad919006e2dad2c9063c60559ece4dd239f3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:00Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.267569 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gjjcm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e34542c-29e8-43d3-add3-42aa080e0c8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b138ef0764baeeae025ff996cd643edabbc1de8f567e73f41fa748f481929656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqc9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gjjcm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:00Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.279204 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c87d128c-39da-4880-9efc-7d2e6319a854\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dee71204e7a71c311143a6ed6fbde69968bc877053d6a400d252eaa010a88ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6cd4a3aeb4f2f1ad5ad6b22f80134a393ccb4c5b7e22c33d97324c8f70800d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a05bba24ff7f1d956fa0d3cb5949de61a278cddb4674bdb723749a2b005d82d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec3d3a350dce6c9b5ea637d00abdaf43796a5595105dc82133f3d84349c12c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://932166905ddf0fc44b9b7356a8f5efe68e1d44307d8e486946c8285ae969b8e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64685
d2778ef0cf9d316897042a9e9353603a0cc7a234baf821ec03b149aeb07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de4d6b9f459f3e66cc78f2c00c0e4c9a39554ea732d4f2e1d2b85175855ef2bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5901d1e7ebd3d979587ab0b065c05f639de1f07b2139edac2d4cf80c289503ff\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"message\\\":\\\"1.918808 5946 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-p4bl4 after 0 failed attempt(s)\\\\nI1209 14:58:41.919392 5946 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-p4bl4\\\\nI1209 14:58:41.918113 5946 services_controller.go:434] Service openshift-cluster-version/cluster-version-operator retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{cluster-version-operator openshift-cluster-version ddf4933a-f532-4906-9b8f-3b15aa433264 6187 0 2025-02-23 05:11:57 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:cluster-version-operator] map[exclude.release.openshift.io/internal-openshift-hosted:true include.release.openshift.io/self-managed-high-availability:true kubernetes.io/description:Expose cluster-version operator metrics to other in-cluster consumers. Access requires a prometheus-k8s RoleBinding in this namespace. 
service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:cluster-version-operator-serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc0006c6a07 \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Nam\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de4d6b9f459f3e66cc78f2c00c0e4c9a39554ea732d4f2e1d2b85175855ef2bd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T14:58:57Z\\\",\\\"message\\\":\\\"ice k8s.ovn.org/owner:openshift-machine-config-operator/machine-config-operator]} name:Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.183:9001:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {5b85277d-d9b7-4a68-8e4e-2b80594d9347}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1209 14:58:57.738539 6180 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver/api]} name:Service_openshift-apiserver/api_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.37:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {88e20c31-5b8d-4d44-bbd8-dba87b7dbaf0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1209 14:58:57.738561 6180 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service 
k8s.ovn\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9dc3533604ee31d652a75a8c65146c35b320c41f2d4fe85c3c03cadb99a9d46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ce6bcdccee7782ed6a2472052064ea5f0f9a5f25a8645c692165a2a4bf16dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0
d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9ce6bcdccee7782ed6a2472052064ea5f0f9a5f25a8645c692165a2a4bf16dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p4bl4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:00Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.286749 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-72msc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4f13f82-a44f-461d-93b0-916a7ed0b925\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06c787dc46c305e7388f67f85601eb3e724c501d4da0504d4f8fd569f533c634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2spqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.1
68.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-72msc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:00Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.298953 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cb89a32-a132-4a5c-88ca-e561d2462abb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8452332d27457e371020c42533ef78285e33851568eed69be71bc9d10e663b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32422f3e8abddcc0d533cce0b4fc55c65309e00a02d4757d415d6f63f9b2e47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9db356805ba4080
379f39ddf432276ac77417504341e77c1cc2d704756af1f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550d7f8a2e6cf94de70026192f2660d59f31fa1675a7c7e432767db5bf58c1b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://630ad7ccda8e51ae4d116b5821e71ebcc3fb68b4614de58470610807ad351ed6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"
image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:00Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.306529 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffcfa1a3-f24c-4809-8468-636ab9b1a063\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5c1f863d399da019632ceb8d698cf8aa186d0f533ff8a49da7c66dd11086253\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa09f69008b086bb69e768437b40d896803a11f82f6949202ed0a456f3a213be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b44fa2c57434247fea11c1e1fbed34ad1df532e7e6bba0623634541322627f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a30d185f354ca39197fe30559783400a7d218ef0a162fd5fa5e5039d1946d127\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a30d185f354ca39197fe30559783400a7d218ef0a162fd5fa5e5039d1946d127\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:00Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.333691 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.333722 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.333732 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.333744 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.333753 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:00Z","lastTransitionTime":"2025-12-09T14:59:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.435333 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.435499 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.435608 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.435681 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.435800 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:00Z","lastTransitionTime":"2025-12-09T14:59:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.537256 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.537281 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.537290 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.537319 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.537329 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:00Z","lastTransitionTime":"2025-12-09T14:59:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.602936 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.603036 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.603103 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.603178 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.603245 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:00Z","lastTransitionTime":"2025-12-09T14:59:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:00 crc kubenswrapper[4555]: E1209 14:59:00.611123 4555 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"73387e21-fe11-4735-b3c1-0a3cfb47f8fd\\\",\\\"systemUUID\\\":\\\"bd9f288e-1f46-4e96-a21f-7eb84ab89d9c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:00Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.613848 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.613930 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.613997 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.614050 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.614107 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:00Z","lastTransitionTime":"2025-12-09T14:59:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:00 crc kubenswrapper[4555]: E1209 14:59:00.621757 4555 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"73387e21-fe11-4735-b3c1-0a3cfb47f8fd\\\",\\\"systemUUID\\\":\\\"bd9f288e-1f46-4e96-a21f-7eb84ab89d9c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:00Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.623913 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.623986 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.623996 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.624009 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.624017 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:00Z","lastTransitionTime":"2025-12-09T14:59:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:00 crc kubenswrapper[4555]: E1209 14:59:00.632158 4555 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"73387e21-fe11-4735-b3c1-0a3cfb47f8fd\\\",\\\"systemUUID\\\":\\\"bd9f288e-1f46-4e96-a21f-7eb84ab89d9c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:00Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.634484 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.634507 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.634517 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.634527 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.634534 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:00Z","lastTransitionTime":"2025-12-09T14:59:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:00 crc kubenswrapper[4555]: E1209 14:59:00.642376 4555 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"73387e21-fe11-4735-b3c1-0a3cfb47f8fd\\\",\\\"systemUUID\\\":\\\"bd9f288e-1f46-4e96-a21f-7eb84ab89d9c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:00Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.644417 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.644523 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.644601 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.644757 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.644883 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:00Z","lastTransitionTime":"2025-12-09T14:59:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:00 crc kubenswrapper[4555]: E1209 14:59:00.653425 4555 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"73387e21-fe11-4735-b3c1-0a3cfb47f8fd\\\",\\\"systemUUID\\\":\\\"bd9f288e-1f46-4e96-a21f-7eb84ab89d9c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:00Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:00 crc kubenswrapper[4555]: E1209 14:59:00.653642 4555 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.654512 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.654537 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.654546 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.654555 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.654561 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:00Z","lastTransitionTime":"2025-12-09T14:59:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.756265 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.756378 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.756437 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.756499 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.756555 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:00Z","lastTransitionTime":"2025-12-09T14:59:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.859004 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.859048 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.859060 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.859074 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.859103 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:00Z","lastTransitionTime":"2025-12-09T14:59:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.960450 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.960499 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.960507 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.960517 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:00 crc kubenswrapper[4555]: I1209 14:59:00.960525 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:00Z","lastTransitionTime":"2025-12-09T14:59:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:01 crc kubenswrapper[4555]: I1209 14:59:01.062643 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:01 crc kubenswrapper[4555]: I1209 14:59:01.062664 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:01 crc kubenswrapper[4555]: I1209 14:59:01.062672 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:01 crc kubenswrapper[4555]: I1209 14:59:01.062681 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:01 crc kubenswrapper[4555]: I1209 14:59:01.062688 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:01Z","lastTransitionTime":"2025-12-09T14:59:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:01 crc kubenswrapper[4555]: I1209 14:59:01.160290 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:59:01 crc kubenswrapper[4555]: E1209 14:59:01.160413 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 14:59:01 crc kubenswrapper[4555]: I1209 14:59:01.160432 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:59:01 crc kubenswrapper[4555]: I1209 14:59:01.160455 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:59:01 crc kubenswrapper[4555]: E1209 14:59:01.160520 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 14:59:01 crc kubenswrapper[4555]: E1209 14:59:01.160554 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 14:59:01 crc kubenswrapper[4555]: I1209 14:59:01.164541 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:01 crc kubenswrapper[4555]: I1209 14:59:01.164566 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:01 crc kubenswrapper[4555]: I1209 14:59:01.164574 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:01 crc kubenswrapper[4555]: I1209 14:59:01.164584 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:01 crc kubenswrapper[4555]: I1209 14:59:01.164592 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:01Z","lastTransitionTime":"2025-12-09T14:59:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:01 crc kubenswrapper[4555]: I1209 14:59:01.266572 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:01 crc kubenswrapper[4555]: I1209 14:59:01.266598 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:01 crc kubenswrapper[4555]: I1209 14:59:01.266605 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:01 crc kubenswrapper[4555]: I1209 14:59:01.266614 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:01 crc kubenswrapper[4555]: I1209 14:59:01.266622 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:01Z","lastTransitionTime":"2025-12-09T14:59:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:01 crc kubenswrapper[4555]: I1209 14:59:01.367479 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:01 crc kubenswrapper[4555]: I1209 14:59:01.367504 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:01 crc kubenswrapper[4555]: I1209 14:59:01.367512 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:01 crc kubenswrapper[4555]: I1209 14:59:01.367521 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:01 crc kubenswrapper[4555]: I1209 14:59:01.367529 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:01Z","lastTransitionTime":"2025-12-09T14:59:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:01 crc kubenswrapper[4555]: I1209 14:59:01.469145 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:01 crc kubenswrapper[4555]: I1209 14:59:01.469166 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:01 crc kubenswrapper[4555]: I1209 14:59:01.469174 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:01 crc kubenswrapper[4555]: I1209 14:59:01.469182 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:01 crc kubenswrapper[4555]: I1209 14:59:01.469189 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:01Z","lastTransitionTime":"2025-12-09T14:59:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:01 crc kubenswrapper[4555]: I1209 14:59:01.571036 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:01 crc kubenswrapper[4555]: I1209 14:59:01.571060 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:01 crc kubenswrapper[4555]: I1209 14:59:01.571069 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:01 crc kubenswrapper[4555]: I1209 14:59:01.571091 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:01 crc kubenswrapper[4555]: I1209 14:59:01.571099 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:01Z","lastTransitionTime":"2025-12-09T14:59:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:01 crc kubenswrapper[4555]: I1209 14:59:01.587632 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" Dec 09 14:59:01 crc kubenswrapper[4555]: I1209 14:59:01.588128 4555 scope.go:117] "RemoveContainer" containerID="de4d6b9f459f3e66cc78f2c00c0e4c9a39554ea732d4f2e1d2b85175855ef2bd" Dec 09 14:59:01 crc kubenswrapper[4555]: E1209 14:59:01.588244 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-p4bl4_openshift-ovn-kubernetes(c87d128c-39da-4880-9efc-7d2e6319a854)\"" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" podUID="c87d128c-39da-4880-9efc-7d2e6319a854" Dec 09 14:59:01 crc kubenswrapper[4555]: I1209 14:59:01.596652 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:01Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:01 crc kubenswrapper[4555]: I1209 14:59:01.603323 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6sc2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35f7e7bd-b667-42fa-b040-4a186e4570e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mc8nv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mc8nv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:46Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6sc2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:01Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:01 crc kubenswrapper[4555]: I1209 14:59:01.611767 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"031bef68-de1b-47ae-92d9-8f3ab57baecb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd53bcc6affff59de2add405be1496710b4488d3794d358533ddaeecb1c50c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cd3af11c101edd63da0124a2190c0ee60408cf12e241de0b5743aff5732738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://787954aa7f87026339e7b167ba10c60c81f91c4d0f234fc508684f929edda0ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c591bf964c364ed5dfb0b30ae43f7fd456c549f701b0f1b8f7ab9ca7b8648f1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7803e19de1c3c096dfb29cd40b66a640cf8fe695b24eaf0ecfef38e88feefc27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:01Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:01 crc kubenswrapper[4555]: I1209 14:59:01.620247 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23358419-437c-41bf-bf1f-0da572d7ae77\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5ec008c2fa2ff6b0bffe7a32708ff1d0bfcfc0b6edbdd084977b99a2051bdc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://376e853c04e3ccafef162872380c7bf1dc0301c04c70c5428b965ca18989088f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6099f0072464b66ae672e70c295fe21a13657b81cc71dcecc55314cd91816c75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b8f1c6dd89c91a8ff2ca8baf361ee78811eeb8e10f2c4bb0504644cdd1302d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:01Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:01 crc kubenswrapper[4555]: I1209 14:59:01.628721 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d81323acabedcf739459ccc35af7dcbf68cd1159c9304c0ef841d774287425cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:01Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:01 crc kubenswrapper[4555]: I1209 14:59:01.636028 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:01Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:01 crc kubenswrapper[4555]: I1209 14:59:01.643060 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://222e3c51a925bbfd932c3620f13ad919006e2dad2c9063c60559ece4dd239f3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:01Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:01 crc kubenswrapper[4555]: I1209 14:59:01.651327 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gjjcm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e34542c-29e8-43d3-add3-42aa080e0c8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b138ef0764baeeae025ff996cd643edabbc1de8f567e73f41fa748f481929656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqc9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gjjcm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:01Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:01 crc kubenswrapper[4555]: I1209 14:59:01.662894 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c87d128c-39da-4880-9efc-7d2e6319a854\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dee71204e7a71c311143a6ed6fbde69968bc877053d6a400d252eaa010a88ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6cd4a3aeb4f2f1ad5ad6b22f80134a393ccb4c5b7e22c33d97324c8f70800d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a05bba24ff7f1d956fa0d3cb5949de61a278cddb4674bdb723749a2b005d82d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec3d3a350dce6c9b5ea637d00abdaf43796a5595105dc82133f3d84349c12c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://932166905ddf0fc44b9b7356a8f5efe68e1d44307d8e486946c8285ae969b8e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64685
d2778ef0cf9d316897042a9e9353603a0cc7a234baf821ec03b149aeb07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de4d6b9f459f3e66cc78f2c00c0e4c9a39554ea732d4f2e1d2b85175855ef2bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de4d6b9f459f3e66cc78f2c00c0e4c9a39554ea732d4f2e1d2b85175855ef2bd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T14:58:57Z\\\",\\\"message\\\":\\\"ice k8s.ovn.org/owner:openshift-machine-config-operator/machine-config-operator]} name:Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.183:9001:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {5b85277d-d9b7-4a68-8e4e-2b80594d9347}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1209 14:58:57.738539 6180 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver/api]} name:Service_openshift-apiserver/api_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.37:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {88e20c31-5b8d-4d44-bbd8-dba87b7dbaf0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1209 14:58:57.738561 6180 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service 
k8s.ovn\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-p4bl4_openshift-ovn-kubernetes(c87d128c-39da-4880-9efc-7d2e6319a854)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9dc3533604ee31d652a75a8c65146c35b320c41f2d4fe85c3c03cadb99a9d46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recur
siveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ce6bcdccee7782ed6a2472052064ea5f0f9a5f25a8645c692165a2a4bf16dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9ce6bcdccee7782ed6a2472052064ea5f0f9a5f25a8645c692165a2a4bf16dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p4bl4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:01Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:01 crc kubenswrapper[4555]: I1209 14:59:01.669052 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-72msc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4f13f82-a44f-461d-93b0-916a7ed0b925\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06c787dc46c305e7388f67f85601eb3e724c501d4da0504d4f8fd569f533c634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2spqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-72msc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:01Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:01 crc kubenswrapper[4555]: I1209 14:59:01.672424 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:01 crc kubenswrapper[4555]: I1209 14:59:01.672450 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:01 crc kubenswrapper[4555]: I1209 14:59:01.672459 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:01 crc kubenswrapper[4555]: I1209 14:59:01.672468 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:01 crc kubenswrapper[4555]: I1209 14:59:01.672476 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:01Z","lastTransitionTime":"2025-12-09T14:59:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:01 crc kubenswrapper[4555]: I1209 14:59:01.681027 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cb89a32-a132-4a5c-88ca-e561d2462abb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8452332d27457e371020c42533ef78285e33851568eed69be71bc9d10e663b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32422f3e8abddcc0d533cce0b4fc55c65309e00a02d4757d415d6f63f9b2e47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9db356805ba4080379f39ddf432276ac77417504341e77c1cc2d704756af1f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":
0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550d7f8a2e6cf94de70026192f2660d59f31fa1675a7c7e432767db5bf58c1b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://630ad7ccda8e51ae4d116b5821e71ebcc3fb68b4614de58470610807ad351ed6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termi
nated\\\":{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:01Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:01 crc kubenswrapper[4555]: I1209 14:59:01.687997 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffcfa1a3-f24c-4809-8468-636ab9b1a063\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5c1f863d399da019632ceb8d698cf8aa186d0f533ff8a49da7c66dd11086253\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa09f69008b086bb69e768437b40d896803a11f82f6949202ed0a456f3a213be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b44fa2c57434247fea11c1e1fbed34ad1df532e7e6bba0623634541322627f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a30d185f354ca39197fe30559783400a7d218ef0a162fd5fa5e5039d1946d127\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a30d185f354ca39197fe30559783400a7d218ef0a162fd5fa5e5039d1946d127\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:01Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:01 crc kubenswrapper[4555]: I1209 14:59:01.695512 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd5a66337668da0b71f8033ec52d3d1ed9ac69f468d6e48c1e863204ec6a6349\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2fd50693723f3d30a7f5fb890ed870f55b357017b85cdf7206ffc97d0d3d9b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:01Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:01 crc kubenswrapper[4555]: I1209 14:59:01.702370 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b929ab12-7d60-4e9f-a1a7-3e2d8b086d36\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04e20c1851a9313f071ddc6ae1191e467f3bea284ea6e0defd914dc5c926b7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff6nf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b220eef2e4180fd65d4a5381999b91753bb7322a9ab7101edf783cb800715fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff6nf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x6p6q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:01Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:01 crc kubenswrapper[4555]: I1209 14:59:01.711129 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tpsq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd8f66c1-da34-466f-ba53-4848c96b1287\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ae3cc97b46fa36f9bd6f439124d81700af58cb44671d46ffa2936260f6dda4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2028082f4c35a431ec183abac3ace39ae9d9d292a8b3b879cf35cf03dc606d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f836
2f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2028082f4c35a431ec183abac3ace39ae9d9d292a8b3b879cf35cf03dc606d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f6498db60b815cc152db11a9d809f415e6ed632f66cf7eaeecbacb230d59dbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f6498db60b815cc152db11a9d809f415e6ed632f66cf7eaeecbacb230d59dbe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd308a4726360ffaa40fd9895d4684062d998a9dec9c76222a4be5140ece0ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd308a4726360ffaa40fd9895d4684062d998a9dec9c76222a4be5140ece0ced\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\
\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e047250a4bced8e37ee26d9ba3065a61294fa9cc6d9491b6b96e35758b431edc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e047250a4bced8e37ee26d9ba3065a61294fa9cc6d9491b6b96e35758b431edc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d24862fb9296afeb3ec86c0cd5e862021c7e4c8a0f139f9fd11dfd1faba2b392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d24862fb9296afeb3ec86c0cd5e862021c7e4c8a0f139f9fd11dfd1faba2b392\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ac07e538318096b1c9c2c8d0608f0b4d9b4367df0bb24575591b2f3f6c28c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ac07e538318096b1c9c2c8d0608f0b4d9b4367df0bb24575591b2f3f6c28c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:38Z\\\
"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tpsq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:01Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:01 crc kubenswrapper[4555]: I1209 14:59:01.718157 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qchk6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae951ddc-0e90-497c-af85-8e39088f32d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59535f0584581ae9d42fa3c1d9f4f204c8a2c7c96d211a2b8d74806eb3901a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwhfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1a5e10f192fc24405b980ce611a09f1e0c6d49c4c94df89bceef151a747f1aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwhfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qchk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:01Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:01 crc kubenswrapper[4555]: I1209 14:59:01.725474 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:01Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:01 crc kubenswrapper[4555]: I1209 14:59:01.731438 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t6kl9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c97d4dc5-ce47-4a7b-9511-7168db0ee4b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd97a458fe790bf9d7a8587a77d464c8bc4f97919bf4646660d92bd016be2707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47xhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t6kl9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-09T14:59:01Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:01 crc kubenswrapper[4555]: I1209 14:59:01.773892 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:01 crc kubenswrapper[4555]: I1209 14:59:01.773917 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:01 crc kubenswrapper[4555]: I1209 14:59:01.773926 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:01 crc kubenswrapper[4555]: I1209 14:59:01.773937 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:01 crc kubenswrapper[4555]: I1209 14:59:01.773946 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:01Z","lastTransitionTime":"2025-12-09T14:59:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:01 crc kubenswrapper[4555]: I1209 14:59:01.875702 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:01 crc kubenswrapper[4555]: I1209 14:59:01.875730 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:01 crc kubenswrapper[4555]: I1209 14:59:01.875738 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:01 crc kubenswrapper[4555]: I1209 14:59:01.875749 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:01 crc kubenswrapper[4555]: I1209 14:59:01.875767 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:01Z","lastTransitionTime":"2025-12-09T14:59:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:01 crc kubenswrapper[4555]: I1209 14:59:01.977420 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:01 crc kubenswrapper[4555]: I1209 14:59:01.977453 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:01 crc kubenswrapper[4555]: I1209 14:59:01.977464 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:01 crc kubenswrapper[4555]: I1209 14:59:01.977475 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:01 crc kubenswrapper[4555]: I1209 14:59:01.977484 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:01Z","lastTransitionTime":"2025-12-09T14:59:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:02 crc kubenswrapper[4555]: I1209 14:59:02.079230 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:02 crc kubenswrapper[4555]: I1209 14:59:02.079252 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:02 crc kubenswrapper[4555]: I1209 14:59:02.079260 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:02 crc kubenswrapper[4555]: I1209 14:59:02.079270 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:02 crc kubenswrapper[4555]: I1209 14:59:02.079277 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:02Z","lastTransitionTime":"2025-12-09T14:59:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:02 crc kubenswrapper[4555]: I1209 14:59:02.160843 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sc2q" Dec 09 14:59:02 crc kubenswrapper[4555]: E1209 14:59:02.160946 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6sc2q" podUID="35f7e7bd-b667-42fa-b040-4a186e4570e3" Dec 09 14:59:02 crc kubenswrapper[4555]: I1209 14:59:02.180352 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:02 crc kubenswrapper[4555]: I1209 14:59:02.180390 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:02 crc kubenswrapper[4555]: I1209 14:59:02.180399 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:02 crc kubenswrapper[4555]: I1209 14:59:02.180413 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:02 crc kubenswrapper[4555]: I1209 14:59:02.180423 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:02Z","lastTransitionTime":"2025-12-09T14:59:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:02 crc kubenswrapper[4555]: I1209 14:59:02.235889 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/35f7e7bd-b667-42fa-b040-4a186e4570e3-metrics-certs\") pod \"network-metrics-daemon-6sc2q\" (UID: \"35f7e7bd-b667-42fa-b040-4a186e4570e3\") " pod="openshift-multus/network-metrics-daemon-6sc2q" Dec 09 14:59:02 crc kubenswrapper[4555]: E1209 14:59:02.236058 4555 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 14:59:02 crc kubenswrapper[4555]: E1209 14:59:02.236130 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/35f7e7bd-b667-42fa-b040-4a186e4570e3-metrics-certs podName:35f7e7bd-b667-42fa-b040-4a186e4570e3 nodeName:}" failed. No retries permitted until 2025-12-09 14:59:18.236113946 +0000 UTC m=+68.372826674 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/35f7e7bd-b667-42fa-b040-4a186e4570e3-metrics-certs") pod "network-metrics-daemon-6sc2q" (UID: "35f7e7bd-b667-42fa-b040-4a186e4570e3") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 14:59:02 crc kubenswrapper[4555]: I1209 14:59:02.282222 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:02 crc kubenswrapper[4555]: I1209 14:59:02.282248 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:02 crc kubenswrapper[4555]: I1209 14:59:02.282257 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:02 crc kubenswrapper[4555]: I1209 14:59:02.282270 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:02 crc kubenswrapper[4555]: I1209 14:59:02.282278 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:02Z","lastTransitionTime":"2025-12-09T14:59:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:02 crc kubenswrapper[4555]: I1209 14:59:02.384182 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:02 crc kubenswrapper[4555]: I1209 14:59:02.384216 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:02 crc kubenswrapper[4555]: I1209 14:59:02.384225 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:02 crc kubenswrapper[4555]: I1209 14:59:02.384236 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:02 crc kubenswrapper[4555]: I1209 14:59:02.384246 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:02Z","lastTransitionTime":"2025-12-09T14:59:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:02 crc kubenswrapper[4555]: I1209 14:59:02.486145 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:02 crc kubenswrapper[4555]: I1209 14:59:02.486179 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:02 crc kubenswrapper[4555]: I1209 14:59:02.486190 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:02 crc kubenswrapper[4555]: I1209 14:59:02.486201 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:02 crc kubenswrapper[4555]: I1209 14:59:02.486210 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:02Z","lastTransitionTime":"2025-12-09T14:59:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:02 crc kubenswrapper[4555]: I1209 14:59:02.588283 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:02 crc kubenswrapper[4555]: I1209 14:59:02.588343 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:02 crc kubenswrapper[4555]: I1209 14:59:02.588355 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:02 crc kubenswrapper[4555]: I1209 14:59:02.588367 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:02 crc kubenswrapper[4555]: I1209 14:59:02.588375 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:02Z","lastTransitionTime":"2025-12-09T14:59:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:02 crc kubenswrapper[4555]: I1209 14:59:02.689589 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:02 crc kubenswrapper[4555]: I1209 14:59:02.689617 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:02 crc kubenswrapper[4555]: I1209 14:59:02.689627 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:02 crc kubenswrapper[4555]: I1209 14:59:02.689639 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:02 crc kubenswrapper[4555]: I1209 14:59:02.689647 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:02Z","lastTransitionTime":"2025-12-09T14:59:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:02 crc kubenswrapper[4555]: I1209 14:59:02.791531 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:02 crc kubenswrapper[4555]: I1209 14:59:02.791563 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:02 crc kubenswrapper[4555]: I1209 14:59:02.791571 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:02 crc kubenswrapper[4555]: I1209 14:59:02.791583 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:02 crc kubenswrapper[4555]: I1209 14:59:02.791591 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:02Z","lastTransitionTime":"2025-12-09T14:59:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:02 crc kubenswrapper[4555]: I1209 14:59:02.892987 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:02 crc kubenswrapper[4555]: I1209 14:59:02.893018 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:02 crc kubenswrapper[4555]: I1209 14:59:02.893026 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:02 crc kubenswrapper[4555]: I1209 14:59:02.893037 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:02 crc kubenswrapper[4555]: I1209 14:59:02.893046 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:02Z","lastTransitionTime":"2025-12-09T14:59:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:02 crc kubenswrapper[4555]: I1209 14:59:02.995001 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:02 crc kubenswrapper[4555]: I1209 14:59:02.995044 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:02 crc kubenswrapper[4555]: I1209 14:59:02.995054 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:02 crc kubenswrapper[4555]: I1209 14:59:02.995074 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:02 crc kubenswrapper[4555]: I1209 14:59:02.995088 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:02Z","lastTransitionTime":"2025-12-09T14:59:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:03 crc kubenswrapper[4555]: I1209 14:59:03.096712 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:03 crc kubenswrapper[4555]: I1209 14:59:03.096736 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:03 crc kubenswrapper[4555]: I1209 14:59:03.096744 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:03 crc kubenswrapper[4555]: I1209 14:59:03.096756 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:03 crc kubenswrapper[4555]: I1209 14:59:03.096764 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:03Z","lastTransitionTime":"2025-12-09T14:59:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:03 crc kubenswrapper[4555]: I1209 14:59:03.160728 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:59:03 crc kubenswrapper[4555]: I1209 14:59:03.160763 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:59:03 crc kubenswrapper[4555]: I1209 14:59:03.160805 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:59:03 crc kubenswrapper[4555]: E1209 14:59:03.160922 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 14:59:03 crc kubenswrapper[4555]: E1209 14:59:03.160999 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 14:59:03 crc kubenswrapper[4555]: E1209 14:59:03.161097 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 14:59:03 crc kubenswrapper[4555]: I1209 14:59:03.197916 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:03 crc kubenswrapper[4555]: I1209 14:59:03.198071 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:03 crc kubenswrapper[4555]: I1209 14:59:03.198081 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:03 crc kubenswrapper[4555]: I1209 14:59:03.198093 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:03 crc kubenswrapper[4555]: I1209 14:59:03.198103 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:03Z","lastTransitionTime":"2025-12-09T14:59:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:03 crc kubenswrapper[4555]: I1209 14:59:03.300010 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:03 crc kubenswrapper[4555]: I1209 14:59:03.300040 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:03 crc kubenswrapper[4555]: I1209 14:59:03.300048 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:03 crc kubenswrapper[4555]: I1209 14:59:03.300066 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:03 crc kubenswrapper[4555]: I1209 14:59:03.300091 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:03Z","lastTransitionTime":"2025-12-09T14:59:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:03 crc kubenswrapper[4555]: I1209 14:59:03.402006 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:03 crc kubenswrapper[4555]: I1209 14:59:03.402041 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:03 crc kubenswrapper[4555]: I1209 14:59:03.402068 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:03 crc kubenswrapper[4555]: I1209 14:59:03.402081 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:03 crc kubenswrapper[4555]: I1209 14:59:03.402090 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:03Z","lastTransitionTime":"2025-12-09T14:59:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:03 crc kubenswrapper[4555]: I1209 14:59:03.503861 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:03 crc kubenswrapper[4555]: I1209 14:59:03.503889 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:03 crc kubenswrapper[4555]: I1209 14:59:03.503898 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:03 crc kubenswrapper[4555]: I1209 14:59:03.503910 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:03 crc kubenswrapper[4555]: I1209 14:59:03.503918 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:03Z","lastTransitionTime":"2025-12-09T14:59:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:03 crc kubenswrapper[4555]: I1209 14:59:03.605525 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:03 crc kubenswrapper[4555]: I1209 14:59:03.605560 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:03 crc kubenswrapper[4555]: I1209 14:59:03.605569 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:03 crc kubenswrapper[4555]: I1209 14:59:03.605581 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:03 crc kubenswrapper[4555]: I1209 14:59:03.605589 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:03Z","lastTransitionTime":"2025-12-09T14:59:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:03 crc kubenswrapper[4555]: I1209 14:59:03.707150 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:03 crc kubenswrapper[4555]: I1209 14:59:03.707179 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:03 crc kubenswrapper[4555]: I1209 14:59:03.707189 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:03 crc kubenswrapper[4555]: I1209 14:59:03.707201 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:03 crc kubenswrapper[4555]: I1209 14:59:03.707208 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:03Z","lastTransitionTime":"2025-12-09T14:59:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:03 crc kubenswrapper[4555]: I1209 14:59:03.809074 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:03 crc kubenswrapper[4555]: I1209 14:59:03.809104 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:03 crc kubenswrapper[4555]: I1209 14:59:03.809113 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:03 crc kubenswrapper[4555]: I1209 14:59:03.809127 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:03 crc kubenswrapper[4555]: I1209 14:59:03.809134 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:03Z","lastTransitionTime":"2025-12-09T14:59:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:03 crc kubenswrapper[4555]: I1209 14:59:03.910809 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:03 crc kubenswrapper[4555]: I1209 14:59:03.910855 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:03 crc kubenswrapper[4555]: I1209 14:59:03.910866 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:03 crc kubenswrapper[4555]: I1209 14:59:03.910879 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:03 crc kubenswrapper[4555]: I1209 14:59:03.910887 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:03Z","lastTransitionTime":"2025-12-09T14:59:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:04 crc kubenswrapper[4555]: I1209 14:59:04.012444 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:04 crc kubenswrapper[4555]: I1209 14:59:04.012503 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:04 crc kubenswrapper[4555]: I1209 14:59:04.012514 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:04 crc kubenswrapper[4555]: I1209 14:59:04.012526 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:04 crc kubenswrapper[4555]: I1209 14:59:04.012538 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:04Z","lastTransitionTime":"2025-12-09T14:59:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:04 crc kubenswrapper[4555]: I1209 14:59:04.114270 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:04 crc kubenswrapper[4555]: I1209 14:59:04.114316 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:04 crc kubenswrapper[4555]: I1209 14:59:04.114326 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:04 crc kubenswrapper[4555]: I1209 14:59:04.114339 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:04 crc kubenswrapper[4555]: I1209 14:59:04.114348 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:04Z","lastTransitionTime":"2025-12-09T14:59:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:04 crc kubenswrapper[4555]: I1209 14:59:04.160249 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sc2q" Dec 09 14:59:04 crc kubenswrapper[4555]: E1209 14:59:04.160361 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6sc2q" podUID="35f7e7bd-b667-42fa-b040-4a186e4570e3" Dec 09 14:59:04 crc kubenswrapper[4555]: I1209 14:59:04.216462 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:04 crc kubenswrapper[4555]: I1209 14:59:04.216495 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:04 crc kubenswrapper[4555]: I1209 14:59:04.216505 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:04 crc kubenswrapper[4555]: I1209 14:59:04.216517 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:04 crc kubenswrapper[4555]: I1209 14:59:04.216525 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:04Z","lastTransitionTime":"2025-12-09T14:59:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:04 crc kubenswrapper[4555]: I1209 14:59:04.318371 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:04 crc kubenswrapper[4555]: I1209 14:59:04.318400 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:04 crc kubenswrapper[4555]: I1209 14:59:04.318408 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:04 crc kubenswrapper[4555]: I1209 14:59:04.318418 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:04 crc kubenswrapper[4555]: I1209 14:59:04.318427 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:04Z","lastTransitionTime":"2025-12-09T14:59:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:04 crc kubenswrapper[4555]: I1209 14:59:04.419885 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:04 crc kubenswrapper[4555]: I1209 14:59:04.419914 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:04 crc kubenswrapper[4555]: I1209 14:59:04.419923 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:04 crc kubenswrapper[4555]: I1209 14:59:04.419933 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:04 crc kubenswrapper[4555]: I1209 14:59:04.419941 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:04Z","lastTransitionTime":"2025-12-09T14:59:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:04 crc kubenswrapper[4555]: I1209 14:59:04.521187 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:04 crc kubenswrapper[4555]: I1209 14:59:04.521214 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:04 crc kubenswrapper[4555]: I1209 14:59:04.521222 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:04 crc kubenswrapper[4555]: I1209 14:59:04.521232 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:04 crc kubenswrapper[4555]: I1209 14:59:04.521239 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:04Z","lastTransitionTime":"2025-12-09T14:59:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:04 crc kubenswrapper[4555]: I1209 14:59:04.622848 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:04 crc kubenswrapper[4555]: I1209 14:59:04.622879 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:04 crc kubenswrapper[4555]: I1209 14:59:04.622888 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:04 crc kubenswrapper[4555]: I1209 14:59:04.622898 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:04 crc kubenswrapper[4555]: I1209 14:59:04.622906 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:04Z","lastTransitionTime":"2025-12-09T14:59:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:04 crc kubenswrapper[4555]: I1209 14:59:04.724540 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:04 crc kubenswrapper[4555]: I1209 14:59:04.724633 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:04 crc kubenswrapper[4555]: I1209 14:59:04.724689 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:04 crc kubenswrapper[4555]: I1209 14:59:04.724744 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:04 crc kubenswrapper[4555]: I1209 14:59:04.724794 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:04Z","lastTransitionTime":"2025-12-09T14:59:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:04 crc kubenswrapper[4555]: I1209 14:59:04.826472 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:04 crc kubenswrapper[4555]: I1209 14:59:04.826504 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:04 crc kubenswrapper[4555]: I1209 14:59:04.826514 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:04 crc kubenswrapper[4555]: I1209 14:59:04.826527 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:04 crc kubenswrapper[4555]: I1209 14:59:04.826535 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:04Z","lastTransitionTime":"2025-12-09T14:59:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:04 crc kubenswrapper[4555]: I1209 14:59:04.928208 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:04 crc kubenswrapper[4555]: I1209 14:59:04.928242 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:04 crc kubenswrapper[4555]: I1209 14:59:04.928252 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:04 crc kubenswrapper[4555]: I1209 14:59:04.928266 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:04 crc kubenswrapper[4555]: I1209 14:59:04.928275 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:04Z","lastTransitionTime":"2025-12-09T14:59:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:05 crc kubenswrapper[4555]: I1209 14:59:05.030052 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:05 crc kubenswrapper[4555]: I1209 14:59:05.030098 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:05 crc kubenswrapper[4555]: I1209 14:59:05.030108 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:05 crc kubenswrapper[4555]: I1209 14:59:05.030118 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:05 crc kubenswrapper[4555]: I1209 14:59:05.030125 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:05Z","lastTransitionTime":"2025-12-09T14:59:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:05 crc kubenswrapper[4555]: I1209 14:59:05.131536 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:05 crc kubenswrapper[4555]: I1209 14:59:05.131560 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:05 crc kubenswrapper[4555]: I1209 14:59:05.131568 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:05 crc kubenswrapper[4555]: I1209 14:59:05.131577 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:05 crc kubenswrapper[4555]: I1209 14:59:05.131583 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:05Z","lastTransitionTime":"2025-12-09T14:59:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:05 crc kubenswrapper[4555]: I1209 14:59:05.160911 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:59:05 crc kubenswrapper[4555]: E1209 14:59:05.160980 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 14:59:05 crc kubenswrapper[4555]: I1209 14:59:05.160994 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:59:05 crc kubenswrapper[4555]: I1209 14:59:05.161071 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:59:05 crc kubenswrapper[4555]: E1209 14:59:05.161101 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 14:59:05 crc kubenswrapper[4555]: E1209 14:59:05.161184 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 14:59:05 crc kubenswrapper[4555]: I1209 14:59:05.233511 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:05 crc kubenswrapper[4555]: I1209 14:59:05.233536 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:05 crc kubenswrapper[4555]: I1209 14:59:05.233544 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:05 crc kubenswrapper[4555]: I1209 14:59:05.233555 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:05 crc kubenswrapper[4555]: I1209 14:59:05.233580 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:05Z","lastTransitionTime":"2025-12-09T14:59:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:05 crc kubenswrapper[4555]: I1209 14:59:05.335564 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:05 crc kubenswrapper[4555]: I1209 14:59:05.335588 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:05 crc kubenswrapper[4555]: I1209 14:59:05.335602 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:05 crc kubenswrapper[4555]: I1209 14:59:05.335613 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:05 crc kubenswrapper[4555]: I1209 14:59:05.335621 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:05Z","lastTransitionTime":"2025-12-09T14:59:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:05 crc kubenswrapper[4555]: I1209 14:59:05.436873 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:05 crc kubenswrapper[4555]: I1209 14:59:05.436900 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:05 crc kubenswrapper[4555]: I1209 14:59:05.436909 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:05 crc kubenswrapper[4555]: I1209 14:59:05.436917 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:05 crc kubenswrapper[4555]: I1209 14:59:05.436925 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:05Z","lastTransitionTime":"2025-12-09T14:59:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:05 crc kubenswrapper[4555]: I1209 14:59:05.538597 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:05 crc kubenswrapper[4555]: I1209 14:59:05.538633 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:05 crc kubenswrapper[4555]: I1209 14:59:05.538643 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:05 crc kubenswrapper[4555]: I1209 14:59:05.538656 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:05 crc kubenswrapper[4555]: I1209 14:59:05.538667 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:05Z","lastTransitionTime":"2025-12-09T14:59:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:05 crc kubenswrapper[4555]: I1209 14:59:05.640765 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:05 crc kubenswrapper[4555]: I1209 14:59:05.640791 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:05 crc kubenswrapper[4555]: I1209 14:59:05.640799 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:05 crc kubenswrapper[4555]: I1209 14:59:05.640810 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:05 crc kubenswrapper[4555]: I1209 14:59:05.640818 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:05Z","lastTransitionTime":"2025-12-09T14:59:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:05 crc kubenswrapper[4555]: I1209 14:59:05.742499 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:05 crc kubenswrapper[4555]: I1209 14:59:05.742536 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:05 crc kubenswrapper[4555]: I1209 14:59:05.742546 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:05 crc kubenswrapper[4555]: I1209 14:59:05.742560 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:05 crc kubenswrapper[4555]: I1209 14:59:05.742569 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:05Z","lastTransitionTime":"2025-12-09T14:59:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:05 crc kubenswrapper[4555]: I1209 14:59:05.844628 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:05 crc kubenswrapper[4555]: I1209 14:59:05.844653 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:05 crc kubenswrapper[4555]: I1209 14:59:05.844662 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:05 crc kubenswrapper[4555]: I1209 14:59:05.844675 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:05 crc kubenswrapper[4555]: I1209 14:59:05.844682 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:05Z","lastTransitionTime":"2025-12-09T14:59:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:05 crc kubenswrapper[4555]: I1209 14:59:05.946212 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:05 crc kubenswrapper[4555]: I1209 14:59:05.946232 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:05 crc kubenswrapper[4555]: I1209 14:59:05.946239 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:05 crc kubenswrapper[4555]: I1209 14:59:05.946248 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:05 crc kubenswrapper[4555]: I1209 14:59:05.946255 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:05Z","lastTransitionTime":"2025-12-09T14:59:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:06 crc kubenswrapper[4555]: I1209 14:59:06.048211 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:06 crc kubenswrapper[4555]: I1209 14:59:06.048232 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:06 crc kubenswrapper[4555]: I1209 14:59:06.048242 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:06 crc kubenswrapper[4555]: I1209 14:59:06.048260 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:06 crc kubenswrapper[4555]: I1209 14:59:06.048269 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:06Z","lastTransitionTime":"2025-12-09T14:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:06 crc kubenswrapper[4555]: I1209 14:59:06.149880 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:06 crc kubenswrapper[4555]: I1209 14:59:06.149904 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:06 crc kubenswrapper[4555]: I1209 14:59:06.149912 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:06 crc kubenswrapper[4555]: I1209 14:59:06.149921 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:06 crc kubenswrapper[4555]: I1209 14:59:06.149928 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:06Z","lastTransitionTime":"2025-12-09T14:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:06 crc kubenswrapper[4555]: I1209 14:59:06.160537 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sc2q" Dec 09 14:59:06 crc kubenswrapper[4555]: E1209 14:59:06.160631 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6sc2q" podUID="35f7e7bd-b667-42fa-b040-4a186e4570e3" Dec 09 14:59:06 crc kubenswrapper[4555]: I1209 14:59:06.251570 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:06 crc kubenswrapper[4555]: I1209 14:59:06.251595 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:06 crc kubenswrapper[4555]: I1209 14:59:06.251602 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:06 crc kubenswrapper[4555]: I1209 14:59:06.251613 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:06 crc kubenswrapper[4555]: I1209 14:59:06.251620 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:06Z","lastTransitionTime":"2025-12-09T14:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:06 crc kubenswrapper[4555]: I1209 14:59:06.353181 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:06 crc kubenswrapper[4555]: I1209 14:59:06.353396 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:06 crc kubenswrapper[4555]: I1209 14:59:06.353471 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:06 crc kubenswrapper[4555]: I1209 14:59:06.353537 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:06 crc kubenswrapper[4555]: I1209 14:59:06.353588 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:06Z","lastTransitionTime":"2025-12-09T14:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:06 crc kubenswrapper[4555]: I1209 14:59:06.455106 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:06 crc kubenswrapper[4555]: I1209 14:59:06.455159 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:06 crc kubenswrapper[4555]: I1209 14:59:06.455170 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:06 crc kubenswrapper[4555]: I1209 14:59:06.455181 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:06 crc kubenswrapper[4555]: I1209 14:59:06.455191 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:06Z","lastTransitionTime":"2025-12-09T14:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:06 crc kubenswrapper[4555]: I1209 14:59:06.556524 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:06 crc kubenswrapper[4555]: I1209 14:59:06.556550 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:06 crc kubenswrapper[4555]: I1209 14:59:06.556558 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:06 crc kubenswrapper[4555]: I1209 14:59:06.556568 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:06 crc kubenswrapper[4555]: I1209 14:59:06.556592 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:06Z","lastTransitionTime":"2025-12-09T14:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:06 crc kubenswrapper[4555]: I1209 14:59:06.658127 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:06 crc kubenswrapper[4555]: I1209 14:59:06.658152 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:06 crc kubenswrapper[4555]: I1209 14:59:06.658159 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:06 crc kubenswrapper[4555]: I1209 14:59:06.658168 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:06 crc kubenswrapper[4555]: I1209 14:59:06.658175 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:06Z","lastTransitionTime":"2025-12-09T14:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:06 crc kubenswrapper[4555]: I1209 14:59:06.760097 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:06 crc kubenswrapper[4555]: I1209 14:59:06.760121 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:06 crc kubenswrapper[4555]: I1209 14:59:06.760128 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:06 crc kubenswrapper[4555]: I1209 14:59:06.760137 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:06 crc kubenswrapper[4555]: I1209 14:59:06.760144 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:06Z","lastTransitionTime":"2025-12-09T14:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:06 crc kubenswrapper[4555]: I1209 14:59:06.861882 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:06 crc kubenswrapper[4555]: I1209 14:59:06.861906 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:06 crc kubenswrapper[4555]: I1209 14:59:06.861914 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:06 crc kubenswrapper[4555]: I1209 14:59:06.861923 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:06 crc kubenswrapper[4555]: I1209 14:59:06.861930 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:06Z","lastTransitionTime":"2025-12-09T14:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:06 crc kubenswrapper[4555]: I1209 14:59:06.965390 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:06 crc kubenswrapper[4555]: I1209 14:59:06.965570 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:06 crc kubenswrapper[4555]: I1209 14:59:06.965715 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:06 crc kubenswrapper[4555]: I1209 14:59:06.965729 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:06 crc kubenswrapper[4555]: I1209 14:59:06.965746 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:06Z","lastTransitionTime":"2025-12-09T14:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:07 crc kubenswrapper[4555]: I1209 14:59:07.067273 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:07 crc kubenswrapper[4555]: I1209 14:59:07.067322 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:07 crc kubenswrapper[4555]: I1209 14:59:07.067332 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:07 crc kubenswrapper[4555]: I1209 14:59:07.067343 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:07 crc kubenswrapper[4555]: I1209 14:59:07.067350 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:07Z","lastTransitionTime":"2025-12-09T14:59:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:07 crc kubenswrapper[4555]: I1209 14:59:07.160481 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:59:07 crc kubenswrapper[4555]: I1209 14:59:07.160498 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:59:07 crc kubenswrapper[4555]: I1209 14:59:07.160570 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:59:07 crc kubenswrapper[4555]: E1209 14:59:07.160691 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 14:59:07 crc kubenswrapper[4555]: E1209 14:59:07.160756 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 14:59:07 crc kubenswrapper[4555]: E1209 14:59:07.160824 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 14:59:07 crc kubenswrapper[4555]: I1209 14:59:07.168755 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:07 crc kubenswrapper[4555]: I1209 14:59:07.168777 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:07 crc kubenswrapper[4555]: I1209 14:59:07.168785 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:07 crc kubenswrapper[4555]: I1209 14:59:07.168794 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:07 crc kubenswrapper[4555]: I1209 14:59:07.168801 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:07Z","lastTransitionTime":"2025-12-09T14:59:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:07 crc kubenswrapper[4555]: I1209 14:59:07.270851 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:07 crc kubenswrapper[4555]: I1209 14:59:07.270876 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:07 crc kubenswrapper[4555]: I1209 14:59:07.270884 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:07 crc kubenswrapper[4555]: I1209 14:59:07.270894 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:07 crc kubenswrapper[4555]: I1209 14:59:07.270900 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:07Z","lastTransitionTime":"2025-12-09T14:59:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:07 crc kubenswrapper[4555]: I1209 14:59:07.372457 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:07 crc kubenswrapper[4555]: I1209 14:59:07.372504 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:07 crc kubenswrapper[4555]: I1209 14:59:07.372513 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:07 crc kubenswrapper[4555]: I1209 14:59:07.372524 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:07 crc kubenswrapper[4555]: I1209 14:59:07.372532 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:07Z","lastTransitionTime":"2025-12-09T14:59:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:07 crc kubenswrapper[4555]: I1209 14:59:07.474930 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:07 crc kubenswrapper[4555]: I1209 14:59:07.474953 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:07 crc kubenswrapper[4555]: I1209 14:59:07.474964 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:07 crc kubenswrapper[4555]: I1209 14:59:07.474974 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:07 crc kubenswrapper[4555]: I1209 14:59:07.474982 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:07Z","lastTransitionTime":"2025-12-09T14:59:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:07 crc kubenswrapper[4555]: I1209 14:59:07.577200 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:07 crc kubenswrapper[4555]: I1209 14:59:07.577225 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:07 crc kubenswrapper[4555]: I1209 14:59:07.577233 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:07 crc kubenswrapper[4555]: I1209 14:59:07.577242 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:07 crc kubenswrapper[4555]: I1209 14:59:07.577250 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:07Z","lastTransitionTime":"2025-12-09T14:59:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:07 crc kubenswrapper[4555]: I1209 14:59:07.679089 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:07 crc kubenswrapper[4555]: I1209 14:59:07.679125 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:07 crc kubenswrapper[4555]: I1209 14:59:07.679134 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:07 crc kubenswrapper[4555]: I1209 14:59:07.679167 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:07 crc kubenswrapper[4555]: I1209 14:59:07.679175 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:07Z","lastTransitionTime":"2025-12-09T14:59:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:07 crc kubenswrapper[4555]: I1209 14:59:07.780929 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:07 crc kubenswrapper[4555]: I1209 14:59:07.780955 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:07 crc kubenswrapper[4555]: I1209 14:59:07.780962 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:07 crc kubenswrapper[4555]: I1209 14:59:07.780974 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:07 crc kubenswrapper[4555]: I1209 14:59:07.780981 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:07Z","lastTransitionTime":"2025-12-09T14:59:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:07 crc kubenswrapper[4555]: I1209 14:59:07.882809 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:07 crc kubenswrapper[4555]: I1209 14:59:07.882857 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:07 crc kubenswrapper[4555]: I1209 14:59:07.882866 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:07 crc kubenswrapper[4555]: I1209 14:59:07.882877 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:07 crc kubenswrapper[4555]: I1209 14:59:07.882885 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:07Z","lastTransitionTime":"2025-12-09T14:59:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:07 crc kubenswrapper[4555]: I1209 14:59:07.984180 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:07 crc kubenswrapper[4555]: I1209 14:59:07.984207 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:07 crc kubenswrapper[4555]: I1209 14:59:07.984216 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:07 crc kubenswrapper[4555]: I1209 14:59:07.984226 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:07 crc kubenswrapper[4555]: I1209 14:59:07.984234 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:07Z","lastTransitionTime":"2025-12-09T14:59:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:08 crc kubenswrapper[4555]: I1209 14:59:08.085357 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:08 crc kubenswrapper[4555]: I1209 14:59:08.085409 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:08 crc kubenswrapper[4555]: I1209 14:59:08.085426 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:08 crc kubenswrapper[4555]: I1209 14:59:08.085446 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:08 crc kubenswrapper[4555]: I1209 14:59:08.085458 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:08Z","lastTransitionTime":"2025-12-09T14:59:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:08 crc kubenswrapper[4555]: I1209 14:59:08.160278 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sc2q" Dec 09 14:59:08 crc kubenswrapper[4555]: E1209 14:59:08.160404 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6sc2q" podUID="35f7e7bd-b667-42fa-b040-4a186e4570e3" Dec 09 14:59:08 crc kubenswrapper[4555]: I1209 14:59:08.186830 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:08 crc kubenswrapper[4555]: I1209 14:59:08.186852 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:08 crc kubenswrapper[4555]: I1209 14:59:08.186860 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:08 crc kubenswrapper[4555]: I1209 14:59:08.186870 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:08 crc kubenswrapper[4555]: I1209 14:59:08.186877 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:08Z","lastTransitionTime":"2025-12-09T14:59:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:08 crc kubenswrapper[4555]: I1209 14:59:08.288965 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:08 crc kubenswrapper[4555]: I1209 14:59:08.288996 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:08 crc kubenswrapper[4555]: I1209 14:59:08.289006 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:08 crc kubenswrapper[4555]: I1209 14:59:08.289015 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:08 crc kubenswrapper[4555]: I1209 14:59:08.289022 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:08Z","lastTransitionTime":"2025-12-09T14:59:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:08 crc kubenswrapper[4555]: I1209 14:59:08.391127 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:08 crc kubenswrapper[4555]: I1209 14:59:08.391184 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:08 crc kubenswrapper[4555]: I1209 14:59:08.391193 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:08 crc kubenswrapper[4555]: I1209 14:59:08.391206 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:08 crc kubenswrapper[4555]: I1209 14:59:08.391214 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:08Z","lastTransitionTime":"2025-12-09T14:59:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:08 crc kubenswrapper[4555]: I1209 14:59:08.492464 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:08 crc kubenswrapper[4555]: I1209 14:59:08.492494 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:08 crc kubenswrapper[4555]: I1209 14:59:08.492503 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:08 crc kubenswrapper[4555]: I1209 14:59:08.492513 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:08 crc kubenswrapper[4555]: I1209 14:59:08.492520 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:08Z","lastTransitionTime":"2025-12-09T14:59:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:08 crc kubenswrapper[4555]: I1209 14:59:08.593668 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:08 crc kubenswrapper[4555]: I1209 14:59:08.593693 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:08 crc kubenswrapper[4555]: I1209 14:59:08.593702 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:08 crc kubenswrapper[4555]: I1209 14:59:08.593712 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:08 crc kubenswrapper[4555]: I1209 14:59:08.593719 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:08Z","lastTransitionTime":"2025-12-09T14:59:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:08 crc kubenswrapper[4555]: I1209 14:59:08.694922 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:08 crc kubenswrapper[4555]: I1209 14:59:08.694947 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:08 crc kubenswrapper[4555]: I1209 14:59:08.694955 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:08 crc kubenswrapper[4555]: I1209 14:59:08.694966 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:08 crc kubenswrapper[4555]: I1209 14:59:08.694974 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:08Z","lastTransitionTime":"2025-12-09T14:59:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:08 crc kubenswrapper[4555]: I1209 14:59:08.796797 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:08 crc kubenswrapper[4555]: I1209 14:59:08.796826 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:08 crc kubenswrapper[4555]: I1209 14:59:08.796837 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:08 crc kubenswrapper[4555]: I1209 14:59:08.796860 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:08 crc kubenswrapper[4555]: I1209 14:59:08.796870 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:08Z","lastTransitionTime":"2025-12-09T14:59:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:08 crc kubenswrapper[4555]: I1209 14:59:08.898919 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:08 crc kubenswrapper[4555]: I1209 14:59:08.898948 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:08 crc kubenswrapper[4555]: I1209 14:59:08.898957 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:08 crc kubenswrapper[4555]: I1209 14:59:08.898973 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:08 crc kubenswrapper[4555]: I1209 14:59:08.898989 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:08Z","lastTransitionTime":"2025-12-09T14:59:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:09 crc kubenswrapper[4555]: I1209 14:59:09.000749 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:09 crc kubenswrapper[4555]: I1209 14:59:09.000866 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:09 crc kubenswrapper[4555]: I1209 14:59:09.000925 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:09 crc kubenswrapper[4555]: I1209 14:59:09.001002 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:09 crc kubenswrapper[4555]: I1209 14:59:09.001072 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:09Z","lastTransitionTime":"2025-12-09T14:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:09 crc kubenswrapper[4555]: I1209 14:59:09.102880 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:09 crc kubenswrapper[4555]: I1209 14:59:09.102919 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:09 crc kubenswrapper[4555]: I1209 14:59:09.102927 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:09 crc kubenswrapper[4555]: I1209 14:59:09.102937 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:09 crc kubenswrapper[4555]: I1209 14:59:09.102948 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:09Z","lastTransitionTime":"2025-12-09T14:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:09 crc kubenswrapper[4555]: I1209 14:59:09.160772 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:59:09 crc kubenswrapper[4555]: E1209 14:59:09.160850 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 14:59:09 crc kubenswrapper[4555]: I1209 14:59:09.160884 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:59:09 crc kubenswrapper[4555]: I1209 14:59:09.160918 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:59:09 crc kubenswrapper[4555]: E1209 14:59:09.160991 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 14:59:09 crc kubenswrapper[4555]: E1209 14:59:09.161167 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 14:59:09 crc kubenswrapper[4555]: I1209 14:59:09.205000 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:09 crc kubenswrapper[4555]: I1209 14:59:09.205026 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:09 crc kubenswrapper[4555]: I1209 14:59:09.205035 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:09 crc kubenswrapper[4555]: I1209 14:59:09.205044 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:09 crc kubenswrapper[4555]: I1209 14:59:09.205052 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:09Z","lastTransitionTime":"2025-12-09T14:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:09 crc kubenswrapper[4555]: I1209 14:59:09.306369 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:09 crc kubenswrapper[4555]: I1209 14:59:09.306402 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:09 crc kubenswrapper[4555]: I1209 14:59:09.306413 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:09 crc kubenswrapper[4555]: I1209 14:59:09.306426 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:09 crc kubenswrapper[4555]: I1209 14:59:09.306435 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:09Z","lastTransitionTime":"2025-12-09T14:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:09 crc kubenswrapper[4555]: I1209 14:59:09.411238 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:09 crc kubenswrapper[4555]: I1209 14:59:09.411262 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:09 crc kubenswrapper[4555]: I1209 14:59:09.411270 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:09 crc kubenswrapper[4555]: I1209 14:59:09.411278 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:09 crc kubenswrapper[4555]: I1209 14:59:09.411287 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:09Z","lastTransitionTime":"2025-12-09T14:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:09 crc kubenswrapper[4555]: I1209 14:59:09.513206 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:09 crc kubenswrapper[4555]: I1209 14:59:09.513236 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:09 crc kubenswrapper[4555]: I1209 14:59:09.513245 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:09 crc kubenswrapper[4555]: I1209 14:59:09.513257 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:09 crc kubenswrapper[4555]: I1209 14:59:09.513266 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:09Z","lastTransitionTime":"2025-12-09T14:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:09 crc kubenswrapper[4555]: I1209 14:59:09.614691 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:09 crc kubenswrapper[4555]: I1209 14:59:09.614717 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:09 crc kubenswrapper[4555]: I1209 14:59:09.614725 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:09 crc kubenswrapper[4555]: I1209 14:59:09.614735 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:09 crc kubenswrapper[4555]: I1209 14:59:09.614743 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:09Z","lastTransitionTime":"2025-12-09T14:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:09 crc kubenswrapper[4555]: I1209 14:59:09.716720 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:09 crc kubenswrapper[4555]: I1209 14:59:09.716748 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:09 crc kubenswrapper[4555]: I1209 14:59:09.716757 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:09 crc kubenswrapper[4555]: I1209 14:59:09.716765 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:09 crc kubenswrapper[4555]: I1209 14:59:09.716772 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:09Z","lastTransitionTime":"2025-12-09T14:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:09 crc kubenswrapper[4555]: I1209 14:59:09.818662 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:09 crc kubenswrapper[4555]: I1209 14:59:09.818694 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:09 crc kubenswrapper[4555]: I1209 14:59:09.818703 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:09 crc kubenswrapper[4555]: I1209 14:59:09.818716 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:09 crc kubenswrapper[4555]: I1209 14:59:09.818725 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:09Z","lastTransitionTime":"2025-12-09T14:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:09 crc kubenswrapper[4555]: I1209 14:59:09.921062 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:09 crc kubenswrapper[4555]: I1209 14:59:09.921101 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:09 crc kubenswrapper[4555]: I1209 14:59:09.921111 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:09 crc kubenswrapper[4555]: I1209 14:59:09.921125 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:09 crc kubenswrapper[4555]: I1209 14:59:09.921134 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:09Z","lastTransitionTime":"2025-12-09T14:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.023451 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.023480 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.023490 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.023501 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.023509 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:10Z","lastTransitionTime":"2025-12-09T14:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.126282 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.126323 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.126332 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.126344 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.126352 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:10Z","lastTransitionTime":"2025-12-09T14:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.160115 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sc2q" Dec 09 14:59:10 crc kubenswrapper[4555]: E1209 14:59:10.160226 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6sc2q" podUID="35f7e7bd-b667-42fa-b040-4a186e4570e3" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.170211 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd5a66337668da0b71f8033ec52d3d1ed9ac69f468d6e48c1e863204ec6a6349\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2fd50693723f3d30a7f5fb890ed870f55b357017b85cdf7206ffc97d0d3d9b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:10Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.177597 4555 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b929ab12-7d60-4e9f-a1a7-3e2d8b086d36\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04e20c1851a9313f071ddc6ae1191e467f3bea284ea6e0defd914dc5c926b7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff6nf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b220eef2e4180fd65d4a5381999b91753bb7322a9ab7101edf783cb800715fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff6nf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x6p6q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:10Z is after 2025-08-24T17:21:41Z" Dec 09 
14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.186441 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tpsq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd8f66c1-da34-466f-ba53-4848c96b1287\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ae3cc97b46fa36f9bd6f439124d81700af58cb44671d46ffa2936260f6dda4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2028082f4c35a431ec183abac3ace39ae9d9d292a8b3b879cf35cf03dc606d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2028082f4c35a431ec183abac3ace39ae9d9d292a8b3b879cf35cf03dc606d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f6498db60b815cc152db11a9d809f415e6ed632f66cf7eaeecbacb230d59dbe\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f6498db60b815cc152db11a9d809f415e6ed632f66cf7eaeecbacb230d59dbe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd308a4726360ffaa40fd9895d4684062d998a9dec9c76222a4be5140ece0ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd308a4726360ffaa40fd9895d4684062d998a9dec9c76222a4be5140ece0ced\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e047250a4bced8e37ee26d9ba3065a61294fa9cc6d9491b6b96e35758b431edc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e047250a4bced8e37ee26d9ba3065a61294fa9cc6d9491b6b96e35758b431edc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"
name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d24862fb9296afeb3ec86c0cd5e862021c7e4c8a0f139f9fd11dfd1faba2b392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d24862fb9296afeb3ec86c0cd5e862021c7e4c8a0f139f9fd11dfd1faba2b392\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ac07e538318096b1c9c2c8d0608f0b4d9b4367df0bb24575591b2f3f6c28c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ac07e538318096b1c9c2c8d0608f0b4d9b4367df0bb24575591b2f3f6c28c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tpsq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:10Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.194201 4555 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qchk6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae951ddc-0e90-497c-af85-8e39088f32d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59535f0584581ae9d42fa3c1d9f4f204c8a2c7c96d211a2b8d74806eb3901a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwhfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1a5e10f192fc24405b980ce611a09f1e0c6d49c4c94df89bceef151a747f1aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwhfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qchk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-09T14:59:10Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.202361 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:10Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.208653 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t6kl9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c97d4dc5-ce47-4a7b-9511-7168db0ee4b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd97a458fe790bf9d7a8587a77d464c8bc4f97919bf4646660d92bd016be2707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47xhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t6kl9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:10Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.217112 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:10Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.224921 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6sc2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35f7e7bd-b667-42fa-b040-4a186e4570e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mc8nv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mc8nv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:46Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6sc2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:10Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.227830 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.227875 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.227884 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.227896 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.227905 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:10Z","lastTransitionTime":"2025-12-09T14:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.234245 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"031bef68-de1b-47ae-92d9-8f3ab57baecb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd53bcc6affff59de2add405be1496710b4488d3794d358533ddaeecb1c50c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cd3af11c101edd63da0124a2190c0ee60408cf12e241de0b5743aff5732738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://787954aa7f87026339e7b167ba10c60c81f91c4d0f234fc508684f929edda0ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c591bf964c364ed5dfb0b30ae43f7fd456c549f701b0f1b8f7ab9ca7b8648f1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7803e19de1c3c096dfb29cd40b66a640cf8fe695b24eaf0ecfef38e88feefc27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:10Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.242692 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23358419-437c-41bf-bf1f-0da572d7ae77\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5ec008c2fa2ff6b0bffe7a32708ff1d0bfcfc0b6edbdd084977b99a2051bdc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://376e853c04e3ccafef162872380c7bf1dc0301c04c70c5428b965ca18989088f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6099f0072464b66ae672e70c295fe21a13657b81cc71dcecc55314cd91816c75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b8f1c6dd89c91a8ff2ca8baf361ee78811eeb8e10f2c4bb0504644cdd1302d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:10Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.250660 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d81323acabedcf739459ccc35af7dcbf68cd1159c9304c0ef841d774287425cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:10Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.258258 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:10Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.265622 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://222e3c51a925bbfd932c3620f13ad919006e2dad2c9063c60559ece4dd239f3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:10Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.273282 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gjjcm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e34542c-29e8-43d3-add3-42aa080e0c8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b138ef0764baeeae025ff996cd643edabbc1de8f567e73f41fa748f481929656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqc9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gjjcm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:10Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.285048 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c87d128c-39da-4880-9efc-7d2e6319a854\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dee71204e7a71c311143a6ed6fbde69968bc877053d6a400d252eaa010a88ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6cd4a3aeb4f2f1ad5ad6b22f80134a393ccb4c5b7e22c33d97324c8f70800d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a05bba24ff7f1d956fa0d3cb5949de61a278cddb4674bdb723749a2b005d82d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec3d3a350dce6c9b5ea637d00abdaf43796a5595105dc82133f3d84349c12c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://932166905ddf0fc44b9b7356a8f5efe68e1d44307d8e486946c8285ae969b8e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64685
d2778ef0cf9d316897042a9e9353603a0cc7a234baf821ec03b149aeb07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de4d6b9f459f3e66cc78f2c00c0e4c9a39554ea732d4f2e1d2b85175855ef2bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de4d6b9f459f3e66cc78f2c00c0e4c9a39554ea732d4f2e1d2b85175855ef2bd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T14:58:57Z\\\",\\\"message\\\":\\\"ice k8s.ovn.org/owner:openshift-machine-config-operator/machine-config-operator]} name:Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.183:9001:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {5b85277d-d9b7-4a68-8e4e-2b80594d9347}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1209 14:58:57.738539 6180 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver/api]} name:Service_openshift-apiserver/api_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.37:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {88e20c31-5b8d-4d44-bbd8-dba87b7dbaf0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1209 14:58:57.738561 6180 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service 
k8s.ovn\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-p4bl4_openshift-ovn-kubernetes(c87d128c-39da-4880-9efc-7d2e6319a854)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9dc3533604ee31d652a75a8c65146c35b320c41f2d4fe85c3c03cadb99a9d46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recur
siveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ce6bcdccee7782ed6a2472052064ea5f0f9a5f25a8645c692165a2a4bf16dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9ce6bcdccee7782ed6a2472052064ea5f0f9a5f25a8645c692165a2a4bf16dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p4bl4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:10Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.292811 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-72msc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4f13f82-a44f-461d-93b0-916a7ed0b925\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06c787dc46c305e7388f67f85601eb3e724c501d4da0504d4f8fd569f533c634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2spqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-72msc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:10Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.305384 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cb89a32-a132-4a5c-88ca-e561d2462abb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8452332d27457e371020c42533ef78285e33851568eed69be71bc9d10e663b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32422f3e8abddcc0d533cce0b4fc55c65309e00a02d4757d415d6f63f9b2e47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9db356805ba4080379f39ddf432276ac77417504341e77c1cc2d704756af1f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550d7f8a2e6cf94de70026192f2660d59f31fa1
675a7c7e432767db5bf58c1b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://630ad7ccda8e51ae4d116b5821e71ebcc3fb68b4614de58470610807ad351ed6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:10Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.313668 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffcfa1a3-f24c-4809-8468-636ab9b1a063\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5c1f863d399da019632ceb8d698cf8aa186d0f533ff8a49da7c66dd11086253\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa09f69008b086bb69e768437b40d896803a11f82f6949202ed0a456f3a213be\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b44fa2c57434247fea11c1e1fbed34ad1df532e7e6bba0623634541322627f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a30d185f354ca39197fe30559783400a7d218ef0a162fd5fa5e5039d1946d127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a30d185f354ca39197fe30559783400a7d218ef0a162fd5fa5e5039d1946d127\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:10Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.329508 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.329543 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.329553 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 
14:59:10.329565 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.329574 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:10Z","lastTransitionTime":"2025-12-09T14:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.431206 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.431245 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.431253 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.431266 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.431274 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:10Z","lastTransitionTime":"2025-12-09T14:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.533471 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.533522 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.533536 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.533548 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.533556 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:10Z","lastTransitionTime":"2025-12-09T14:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.635893 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.635922 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.635930 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.635942 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.635949 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:10Z","lastTransitionTime":"2025-12-09T14:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.737562 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.737597 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.737605 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.737619 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.737628 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:10Z","lastTransitionTime":"2025-12-09T14:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.839681 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.839716 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.839724 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.839736 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.839748 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:10Z","lastTransitionTime":"2025-12-09T14:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.941010 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.941041 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.941050 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.941063 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.941072 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:10Z","lastTransitionTime":"2025-12-09T14:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.948062 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.948089 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.948097 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.948106 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.948114 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:10Z","lastTransitionTime":"2025-12-09T14:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:10 crc kubenswrapper[4555]: E1209 14:59:10.956378 4555 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"73387e21-fe11-4735-b3c1-0a3cfb47f8fd\\\",\\\"systemUUID\\\":\\\"bd9f288e-1f46-4e96-a21f-7eb84ab89d9c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:10Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.958413 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.958443 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.958452 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.958480 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.958491 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:10Z","lastTransitionTime":"2025-12-09T14:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:10 crc kubenswrapper[4555]: E1209 14:59:10.968311 4555 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"73387e21-fe11-4735-b3c1-0a3cfb47f8fd\\\",\\\"systemUUID\\\":\\\"bd9f288e-1f46-4e96-a21f-7eb84ab89d9c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:10Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.970692 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.970725 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.970734 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.970753 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.970762 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:10Z","lastTransitionTime":"2025-12-09T14:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:10 crc kubenswrapper[4555]: E1209 14:59:10.978608 4555 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"73387e21-fe11-4735-b3c1-0a3cfb47f8fd\\\",\\\"systemUUID\\\":\\\"bd9f288e-1f46-4e96-a21f-7eb84ab89d9c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:10Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.980624 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.980654 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.980663 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.980675 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.980684 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:10Z","lastTransitionTime":"2025-12-09T14:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:10 crc kubenswrapper[4555]: E1209 14:59:10.988118 4555 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"73387e21-fe11-4735-b3c1-0a3cfb47f8fd\\\",\\\"systemUUID\\\":\\\"bd9f288e-1f46-4e96-a21f-7eb84ab89d9c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:10Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.990268 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.990290 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.990299 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.990322 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:10 crc kubenswrapper[4555]: I1209 14:59:10.990329 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:10Z","lastTransitionTime":"2025-12-09T14:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:10 crc kubenswrapper[4555]: E1209 14:59:10.998620 4555 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"73387e21-fe11-4735-b3c1-0a3cfb47f8fd\\\",\\\"systemUUID\\\":\\\"bd9f288e-1f46-4e96-a21f-7eb84ab89d9c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:10Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:10 crc kubenswrapper[4555]: E1209 14:59:10.998752 4555 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 09 14:59:11 crc kubenswrapper[4555]: I1209 14:59:11.042972 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 09 14:59:11 crc kubenswrapper[4555]: I1209 14:59:11.042999 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:11 crc kubenswrapper[4555]: I1209 14:59:11.043009 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:11 crc kubenswrapper[4555]: I1209 14:59:11.043023 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:11 crc kubenswrapper[4555]: I1209 14:59:11.043032 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:11Z","lastTransitionTime":"2025-12-09T14:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:11 crc kubenswrapper[4555]: I1209 14:59:11.144885 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:11 crc kubenswrapper[4555]: I1209 14:59:11.144919 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:11 crc kubenswrapper[4555]: I1209 14:59:11.144927 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:11 crc kubenswrapper[4555]: I1209 14:59:11.144940 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:11 crc kubenswrapper[4555]: I1209 14:59:11.144958 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:11Z","lastTransitionTime":"2025-12-09T14:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:11 crc kubenswrapper[4555]: I1209 14:59:11.160628 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:59:11 crc kubenswrapper[4555]: E1209 14:59:11.160711 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 14:59:11 crc kubenswrapper[4555]: I1209 14:59:11.160633 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:59:11 crc kubenswrapper[4555]: E1209 14:59:11.160807 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 14:59:11 crc kubenswrapper[4555]: I1209 14:59:11.160635 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:59:11 crc kubenswrapper[4555]: E1209 14:59:11.160894 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 14:59:11 crc kubenswrapper[4555]: I1209 14:59:11.247077 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:11 crc kubenswrapper[4555]: I1209 14:59:11.247105 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:11 crc kubenswrapper[4555]: I1209 14:59:11.247113 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:11 crc kubenswrapper[4555]: I1209 14:59:11.247126 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:11 crc kubenswrapper[4555]: I1209 14:59:11.247134 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:11Z","lastTransitionTime":"2025-12-09T14:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:11 crc kubenswrapper[4555]: I1209 14:59:11.348645 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:11 crc kubenswrapper[4555]: I1209 14:59:11.348677 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:11 crc kubenswrapper[4555]: I1209 14:59:11.348686 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:11 crc kubenswrapper[4555]: I1209 14:59:11.348697 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:11 crc kubenswrapper[4555]: I1209 14:59:11.348705 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:11Z","lastTransitionTime":"2025-12-09T14:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:11 crc kubenswrapper[4555]: I1209 14:59:11.450455 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:11 crc kubenswrapper[4555]: I1209 14:59:11.450489 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:11 crc kubenswrapper[4555]: I1209 14:59:11.450497 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:11 crc kubenswrapper[4555]: I1209 14:59:11.450510 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:11 crc kubenswrapper[4555]: I1209 14:59:11.450520 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:11Z","lastTransitionTime":"2025-12-09T14:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:11 crc kubenswrapper[4555]: I1209 14:59:11.552123 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:11 crc kubenswrapper[4555]: I1209 14:59:11.552162 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:11 crc kubenswrapper[4555]: I1209 14:59:11.552172 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:11 crc kubenswrapper[4555]: I1209 14:59:11.552188 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:11 crc kubenswrapper[4555]: I1209 14:59:11.552197 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:11Z","lastTransitionTime":"2025-12-09T14:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:11 crc kubenswrapper[4555]: I1209 14:59:11.653975 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:11 crc kubenswrapper[4555]: I1209 14:59:11.654005 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:11 crc kubenswrapper[4555]: I1209 14:59:11.654014 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:11 crc kubenswrapper[4555]: I1209 14:59:11.654026 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:11 crc kubenswrapper[4555]: I1209 14:59:11.654034 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:11Z","lastTransitionTime":"2025-12-09T14:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:11 crc kubenswrapper[4555]: I1209 14:59:11.756322 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:11 crc kubenswrapper[4555]: I1209 14:59:11.756350 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:11 crc kubenswrapper[4555]: I1209 14:59:11.756359 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:11 crc kubenswrapper[4555]: I1209 14:59:11.756378 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:11 crc kubenswrapper[4555]: I1209 14:59:11.756389 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:11Z","lastTransitionTime":"2025-12-09T14:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:11 crc kubenswrapper[4555]: I1209 14:59:11.858208 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:11 crc kubenswrapper[4555]: I1209 14:59:11.858237 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:11 crc kubenswrapper[4555]: I1209 14:59:11.858245 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:11 crc kubenswrapper[4555]: I1209 14:59:11.858258 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:11 crc kubenswrapper[4555]: I1209 14:59:11.858266 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:11Z","lastTransitionTime":"2025-12-09T14:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:11 crc kubenswrapper[4555]: I1209 14:59:11.959857 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:11 crc kubenswrapper[4555]: I1209 14:59:11.959879 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:11 crc kubenswrapper[4555]: I1209 14:59:11.959887 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:11 crc kubenswrapper[4555]: I1209 14:59:11.959898 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:11 crc kubenswrapper[4555]: I1209 14:59:11.959905 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:11Z","lastTransitionTime":"2025-12-09T14:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:12 crc kubenswrapper[4555]: I1209 14:59:12.061774 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:12 crc kubenswrapper[4555]: I1209 14:59:12.061800 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:12 crc kubenswrapper[4555]: I1209 14:59:12.061807 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:12 crc kubenswrapper[4555]: I1209 14:59:12.061816 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:12 crc kubenswrapper[4555]: I1209 14:59:12.061824 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:12Z","lastTransitionTime":"2025-12-09T14:59:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:12 crc kubenswrapper[4555]: I1209 14:59:12.160454 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sc2q" Dec 09 14:59:12 crc kubenswrapper[4555]: E1209 14:59:12.160783 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6sc2q" podUID="35f7e7bd-b667-42fa-b040-4a186e4570e3" Dec 09 14:59:12 crc kubenswrapper[4555]: I1209 14:59:12.161067 4555 scope.go:117] "RemoveContainer" containerID="de4d6b9f459f3e66cc78f2c00c0e4c9a39554ea732d4f2e1d2b85175855ef2bd" Dec 09 14:59:12 crc kubenswrapper[4555]: E1209 14:59:12.161267 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-p4bl4_openshift-ovn-kubernetes(c87d128c-39da-4880-9efc-7d2e6319a854)\"" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" podUID="c87d128c-39da-4880-9efc-7d2e6319a854" Dec 09 14:59:12 crc kubenswrapper[4555]: I1209 14:59:12.162697 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:12 crc kubenswrapper[4555]: I1209 14:59:12.162719 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:12 crc kubenswrapper[4555]: I1209 14:59:12.162727 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:12 crc kubenswrapper[4555]: I1209 14:59:12.162737 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:12 crc kubenswrapper[4555]: I1209 14:59:12.162745 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:12Z","lastTransitionTime":"2025-12-09T14:59:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:12 crc kubenswrapper[4555]: I1209 14:59:12.264700 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:12 crc kubenswrapper[4555]: I1209 14:59:12.264726 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:12 crc kubenswrapper[4555]: I1209 14:59:12.264735 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:12 crc kubenswrapper[4555]: I1209 14:59:12.264746 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:12 crc kubenswrapper[4555]: I1209 14:59:12.264753 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:12Z","lastTransitionTime":"2025-12-09T14:59:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:12 crc kubenswrapper[4555]: I1209 14:59:12.366776 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:12 crc kubenswrapper[4555]: I1209 14:59:12.366797 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:12 crc kubenswrapper[4555]: I1209 14:59:12.366805 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:12 crc kubenswrapper[4555]: I1209 14:59:12.366813 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:12 crc kubenswrapper[4555]: I1209 14:59:12.366820 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:12Z","lastTransitionTime":"2025-12-09T14:59:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:12 crc kubenswrapper[4555]: I1209 14:59:12.468653 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:12 crc kubenswrapper[4555]: I1209 14:59:12.468692 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:12 crc kubenswrapper[4555]: I1209 14:59:12.468701 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:12 crc kubenswrapper[4555]: I1209 14:59:12.468716 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:12 crc kubenswrapper[4555]: I1209 14:59:12.468724 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:12Z","lastTransitionTime":"2025-12-09T14:59:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:12 crc kubenswrapper[4555]: I1209 14:59:12.570343 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:12 crc kubenswrapper[4555]: I1209 14:59:12.570374 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:12 crc kubenswrapper[4555]: I1209 14:59:12.570384 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:12 crc kubenswrapper[4555]: I1209 14:59:12.570395 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:12 crc kubenswrapper[4555]: I1209 14:59:12.570402 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:12Z","lastTransitionTime":"2025-12-09T14:59:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:12 crc kubenswrapper[4555]: I1209 14:59:12.672384 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:12 crc kubenswrapper[4555]: I1209 14:59:12.672428 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:12 crc kubenswrapper[4555]: I1209 14:59:12.672438 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:12 crc kubenswrapper[4555]: I1209 14:59:12.672452 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:12 crc kubenswrapper[4555]: I1209 14:59:12.672460 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:12Z","lastTransitionTime":"2025-12-09T14:59:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:12 crc kubenswrapper[4555]: I1209 14:59:12.773816 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:12 crc kubenswrapper[4555]: I1209 14:59:12.773846 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:12 crc kubenswrapper[4555]: I1209 14:59:12.773854 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:12 crc kubenswrapper[4555]: I1209 14:59:12.773863 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:12 crc kubenswrapper[4555]: I1209 14:59:12.773871 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:12Z","lastTransitionTime":"2025-12-09T14:59:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:12 crc kubenswrapper[4555]: I1209 14:59:12.875780 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:12 crc kubenswrapper[4555]: I1209 14:59:12.875817 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:12 crc kubenswrapper[4555]: I1209 14:59:12.875833 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:12 crc kubenswrapper[4555]: I1209 14:59:12.875847 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:12 crc kubenswrapper[4555]: I1209 14:59:12.875856 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:12Z","lastTransitionTime":"2025-12-09T14:59:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:12 crc kubenswrapper[4555]: I1209 14:59:12.978171 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:12 crc kubenswrapper[4555]: I1209 14:59:12.978234 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:12 crc kubenswrapper[4555]: I1209 14:59:12.978243 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:12 crc kubenswrapper[4555]: I1209 14:59:12.978253 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:12 crc kubenswrapper[4555]: I1209 14:59:12.978260 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:12Z","lastTransitionTime":"2025-12-09T14:59:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:13 crc kubenswrapper[4555]: I1209 14:59:13.080274 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:13 crc kubenswrapper[4555]: I1209 14:59:13.080320 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:13 crc kubenswrapper[4555]: I1209 14:59:13.080331 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:13 crc kubenswrapper[4555]: I1209 14:59:13.080342 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:13 crc kubenswrapper[4555]: I1209 14:59:13.080350 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:13Z","lastTransitionTime":"2025-12-09T14:59:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:13 crc kubenswrapper[4555]: I1209 14:59:13.160422 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:59:13 crc kubenswrapper[4555]: I1209 14:59:13.160497 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:59:13 crc kubenswrapper[4555]: E1209 14:59:13.160591 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 14:59:13 crc kubenswrapper[4555]: I1209 14:59:13.160593 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:59:13 crc kubenswrapper[4555]: E1209 14:59:13.160683 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 14:59:13 crc kubenswrapper[4555]: E1209 14:59:13.160784 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 14:59:13 crc kubenswrapper[4555]: I1209 14:59:13.181766 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:13 crc kubenswrapper[4555]: I1209 14:59:13.182033 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:13 crc kubenswrapper[4555]: I1209 14:59:13.182133 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:13 crc kubenswrapper[4555]: I1209 14:59:13.182200 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:13 crc kubenswrapper[4555]: I1209 14:59:13.182785 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:13Z","lastTransitionTime":"2025-12-09T14:59:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:13 crc kubenswrapper[4555]: I1209 14:59:13.284356 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:13 crc kubenswrapper[4555]: I1209 14:59:13.284447 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:13 crc kubenswrapper[4555]: I1209 14:59:13.284512 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:13 crc kubenswrapper[4555]: I1209 14:59:13.284577 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:13 crc kubenswrapper[4555]: I1209 14:59:13.284636 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:13Z","lastTransitionTime":"2025-12-09T14:59:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:13 crc kubenswrapper[4555]: I1209 14:59:13.386633 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:13 crc kubenswrapper[4555]: I1209 14:59:13.386664 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:13 crc kubenswrapper[4555]: I1209 14:59:13.386673 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:13 crc kubenswrapper[4555]: I1209 14:59:13.386685 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:13 crc kubenswrapper[4555]: I1209 14:59:13.386694 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:13Z","lastTransitionTime":"2025-12-09T14:59:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:13 crc kubenswrapper[4555]: I1209 14:59:13.488321 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:13 crc kubenswrapper[4555]: I1209 14:59:13.488353 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:13 crc kubenswrapper[4555]: I1209 14:59:13.488361 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:13 crc kubenswrapper[4555]: I1209 14:59:13.488375 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:13 crc kubenswrapper[4555]: I1209 14:59:13.488383 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:13Z","lastTransitionTime":"2025-12-09T14:59:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:13 crc kubenswrapper[4555]: I1209 14:59:13.590492 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:13 crc kubenswrapper[4555]: I1209 14:59:13.590520 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:13 crc kubenswrapper[4555]: I1209 14:59:13.590528 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:13 crc kubenswrapper[4555]: I1209 14:59:13.590541 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:13 crc kubenswrapper[4555]: I1209 14:59:13.590549 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:13Z","lastTransitionTime":"2025-12-09T14:59:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:13 crc kubenswrapper[4555]: I1209 14:59:13.692507 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:13 crc kubenswrapper[4555]: I1209 14:59:13.692533 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:13 crc kubenswrapper[4555]: I1209 14:59:13.692540 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:13 crc kubenswrapper[4555]: I1209 14:59:13.692552 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:13 crc kubenswrapper[4555]: I1209 14:59:13.692560 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:13Z","lastTransitionTime":"2025-12-09T14:59:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:13 crc kubenswrapper[4555]: I1209 14:59:13.794210 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:13 crc kubenswrapper[4555]: I1209 14:59:13.794484 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:13 crc kubenswrapper[4555]: I1209 14:59:13.794493 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:13 crc kubenswrapper[4555]: I1209 14:59:13.794507 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:13 crc kubenswrapper[4555]: I1209 14:59:13.794516 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:13Z","lastTransitionTime":"2025-12-09T14:59:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:13 crc kubenswrapper[4555]: I1209 14:59:13.896544 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:13 crc kubenswrapper[4555]: I1209 14:59:13.896567 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:13 crc kubenswrapper[4555]: I1209 14:59:13.896575 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:13 crc kubenswrapper[4555]: I1209 14:59:13.896585 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:13 crc kubenswrapper[4555]: I1209 14:59:13.896593 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:13Z","lastTransitionTime":"2025-12-09T14:59:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:13 crc kubenswrapper[4555]: I1209 14:59:13.998859 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:13 crc kubenswrapper[4555]: I1209 14:59:13.998894 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:13 crc kubenswrapper[4555]: I1209 14:59:13.998903 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:13 crc kubenswrapper[4555]: I1209 14:59:13.998916 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:13 crc kubenswrapper[4555]: I1209 14:59:13.998934 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:13Z","lastTransitionTime":"2025-12-09T14:59:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:14 crc kubenswrapper[4555]: I1209 14:59:14.100850 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:14 crc kubenswrapper[4555]: I1209 14:59:14.100953 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:14 crc kubenswrapper[4555]: I1209 14:59:14.101024 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:14 crc kubenswrapper[4555]: I1209 14:59:14.101091 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:14 crc kubenswrapper[4555]: I1209 14:59:14.101149 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:14Z","lastTransitionTime":"2025-12-09T14:59:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:14 crc kubenswrapper[4555]: I1209 14:59:14.160681 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sc2q" Dec 09 14:59:14 crc kubenswrapper[4555]: E1209 14:59:14.160781 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6sc2q" podUID="35f7e7bd-b667-42fa-b040-4a186e4570e3" Dec 09 14:59:14 crc kubenswrapper[4555]: I1209 14:59:14.202241 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:14 crc kubenswrapper[4555]: I1209 14:59:14.202270 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:14 crc kubenswrapper[4555]: I1209 14:59:14.202277 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:14 crc kubenswrapper[4555]: I1209 14:59:14.202287 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:14 crc kubenswrapper[4555]: I1209 14:59:14.202295 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:14Z","lastTransitionTime":"2025-12-09T14:59:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:14 crc kubenswrapper[4555]: I1209 14:59:14.303605 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:14 crc kubenswrapper[4555]: I1209 14:59:14.303629 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:14 crc kubenswrapper[4555]: I1209 14:59:14.303637 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:14 crc kubenswrapper[4555]: I1209 14:59:14.303646 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:14 crc kubenswrapper[4555]: I1209 14:59:14.303653 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:14Z","lastTransitionTime":"2025-12-09T14:59:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:14 crc kubenswrapper[4555]: I1209 14:59:14.405077 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:14 crc kubenswrapper[4555]: I1209 14:59:14.405098 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:14 crc kubenswrapper[4555]: I1209 14:59:14.405106 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:14 crc kubenswrapper[4555]: I1209 14:59:14.405116 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:14 crc kubenswrapper[4555]: I1209 14:59:14.405124 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:14Z","lastTransitionTime":"2025-12-09T14:59:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:14 crc kubenswrapper[4555]: I1209 14:59:14.506807 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:14 crc kubenswrapper[4555]: I1209 14:59:14.506833 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:14 crc kubenswrapper[4555]: I1209 14:59:14.506841 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:14 crc kubenswrapper[4555]: I1209 14:59:14.506852 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:14 crc kubenswrapper[4555]: I1209 14:59:14.506862 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:14Z","lastTransitionTime":"2025-12-09T14:59:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:14 crc kubenswrapper[4555]: I1209 14:59:14.608759 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:14 crc kubenswrapper[4555]: I1209 14:59:14.608785 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:14 crc kubenswrapper[4555]: I1209 14:59:14.608793 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:14 crc kubenswrapper[4555]: I1209 14:59:14.608803 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:14 crc kubenswrapper[4555]: I1209 14:59:14.608810 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:14Z","lastTransitionTime":"2025-12-09T14:59:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:14 crc kubenswrapper[4555]: I1209 14:59:14.709965 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:14 crc kubenswrapper[4555]: I1209 14:59:14.709997 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:14 crc kubenswrapper[4555]: I1209 14:59:14.710007 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:14 crc kubenswrapper[4555]: I1209 14:59:14.710022 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:14 crc kubenswrapper[4555]: I1209 14:59:14.710030 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:14Z","lastTransitionTime":"2025-12-09T14:59:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:14 crc kubenswrapper[4555]: I1209 14:59:14.811610 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:14 crc kubenswrapper[4555]: I1209 14:59:14.811640 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:14 crc kubenswrapper[4555]: I1209 14:59:14.811648 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:14 crc kubenswrapper[4555]: I1209 14:59:14.811657 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:14 crc kubenswrapper[4555]: I1209 14:59:14.811665 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:14Z","lastTransitionTime":"2025-12-09T14:59:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:14 crc kubenswrapper[4555]: I1209 14:59:14.913174 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:14 crc kubenswrapper[4555]: I1209 14:59:14.913208 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:14 crc kubenswrapper[4555]: I1209 14:59:14.913216 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:14 crc kubenswrapper[4555]: I1209 14:59:14.913231 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:14 crc kubenswrapper[4555]: I1209 14:59:14.913240 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:14Z","lastTransitionTime":"2025-12-09T14:59:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:15 crc kubenswrapper[4555]: I1209 14:59:15.014751 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:15 crc kubenswrapper[4555]: I1209 14:59:15.014792 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:15 crc kubenswrapper[4555]: I1209 14:59:15.014847 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:15 crc kubenswrapper[4555]: I1209 14:59:15.014861 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:15 crc kubenswrapper[4555]: I1209 14:59:15.014870 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:15Z","lastTransitionTime":"2025-12-09T14:59:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:15 crc kubenswrapper[4555]: I1209 14:59:15.116519 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:15 crc kubenswrapper[4555]: I1209 14:59:15.116566 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:15 crc kubenswrapper[4555]: I1209 14:59:15.116576 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:15 crc kubenswrapper[4555]: I1209 14:59:15.116591 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:15 crc kubenswrapper[4555]: I1209 14:59:15.116603 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:15Z","lastTransitionTime":"2025-12-09T14:59:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:15 crc kubenswrapper[4555]: I1209 14:59:15.160809 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:59:15 crc kubenswrapper[4555]: I1209 14:59:15.160856 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:59:15 crc kubenswrapper[4555]: I1209 14:59:15.160859 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:59:15 crc kubenswrapper[4555]: E1209 14:59:15.160901 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 14:59:15 crc kubenswrapper[4555]: E1209 14:59:15.160956 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 14:59:15 crc kubenswrapper[4555]: E1209 14:59:15.161006 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 14:59:15 crc kubenswrapper[4555]: I1209 14:59:15.218049 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:15 crc kubenswrapper[4555]: I1209 14:59:15.218077 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:15 crc kubenswrapper[4555]: I1209 14:59:15.218084 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:15 crc kubenswrapper[4555]: I1209 14:59:15.218094 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:15 crc kubenswrapper[4555]: I1209 14:59:15.218102 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:15Z","lastTransitionTime":"2025-12-09T14:59:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:15 crc kubenswrapper[4555]: I1209 14:59:15.322513 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:15 crc kubenswrapper[4555]: I1209 14:59:15.322556 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:15 crc kubenswrapper[4555]: I1209 14:59:15.322567 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:15 crc kubenswrapper[4555]: I1209 14:59:15.322580 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:15 crc kubenswrapper[4555]: I1209 14:59:15.322589 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:15Z","lastTransitionTime":"2025-12-09T14:59:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:15 crc kubenswrapper[4555]: I1209 14:59:15.424709 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:15 crc kubenswrapper[4555]: I1209 14:59:15.424757 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:15 crc kubenswrapper[4555]: I1209 14:59:15.424766 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:15 crc kubenswrapper[4555]: I1209 14:59:15.424779 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:15 crc kubenswrapper[4555]: I1209 14:59:15.424788 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:15Z","lastTransitionTime":"2025-12-09T14:59:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:15 crc kubenswrapper[4555]: I1209 14:59:15.526323 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:15 crc kubenswrapper[4555]: I1209 14:59:15.526355 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:15 crc kubenswrapper[4555]: I1209 14:59:15.526365 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:15 crc kubenswrapper[4555]: I1209 14:59:15.526379 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:15 crc kubenswrapper[4555]: I1209 14:59:15.526390 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:15Z","lastTransitionTime":"2025-12-09T14:59:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:15 crc kubenswrapper[4555]: I1209 14:59:15.627808 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:15 crc kubenswrapper[4555]: I1209 14:59:15.627843 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:15 crc kubenswrapper[4555]: I1209 14:59:15.627853 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:15 crc kubenswrapper[4555]: I1209 14:59:15.627864 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:15 crc kubenswrapper[4555]: I1209 14:59:15.627872 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:15Z","lastTransitionTime":"2025-12-09T14:59:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:15 crc kubenswrapper[4555]: I1209 14:59:15.729207 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:15 crc kubenswrapper[4555]: I1209 14:59:15.729233 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:15 crc kubenswrapper[4555]: I1209 14:59:15.729243 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:15 crc kubenswrapper[4555]: I1209 14:59:15.729255 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:15 crc kubenswrapper[4555]: I1209 14:59:15.729263 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:15Z","lastTransitionTime":"2025-12-09T14:59:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:15 crc kubenswrapper[4555]: I1209 14:59:15.830437 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:15 crc kubenswrapper[4555]: I1209 14:59:15.830463 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:15 crc kubenswrapper[4555]: I1209 14:59:15.830470 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:15 crc kubenswrapper[4555]: I1209 14:59:15.830482 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:15 crc kubenswrapper[4555]: I1209 14:59:15.830489 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:15Z","lastTransitionTime":"2025-12-09T14:59:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:15 crc kubenswrapper[4555]: I1209 14:59:15.932577 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:15 crc kubenswrapper[4555]: I1209 14:59:15.932605 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:15 crc kubenswrapper[4555]: I1209 14:59:15.932614 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:15 crc kubenswrapper[4555]: I1209 14:59:15.932627 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:15 crc kubenswrapper[4555]: I1209 14:59:15.932635 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:15Z","lastTransitionTime":"2025-12-09T14:59:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:16 crc kubenswrapper[4555]: I1209 14:59:16.034461 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:16 crc kubenswrapper[4555]: I1209 14:59:16.034490 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:16 crc kubenswrapper[4555]: I1209 14:59:16.034519 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:16 crc kubenswrapper[4555]: I1209 14:59:16.034529 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:16 crc kubenswrapper[4555]: I1209 14:59:16.034536 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:16Z","lastTransitionTime":"2025-12-09T14:59:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:16 crc kubenswrapper[4555]: I1209 14:59:16.135767 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:16 crc kubenswrapper[4555]: I1209 14:59:16.135794 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:16 crc kubenswrapper[4555]: I1209 14:59:16.135822 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:16 crc kubenswrapper[4555]: I1209 14:59:16.135834 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:16 crc kubenswrapper[4555]: I1209 14:59:16.135841 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:16Z","lastTransitionTime":"2025-12-09T14:59:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:16 crc kubenswrapper[4555]: I1209 14:59:16.160755 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sc2q" Dec 09 14:59:16 crc kubenswrapper[4555]: E1209 14:59:16.160834 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6sc2q" podUID="35f7e7bd-b667-42fa-b040-4a186e4570e3" Dec 09 14:59:16 crc kubenswrapper[4555]: I1209 14:59:16.237835 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:16 crc kubenswrapper[4555]: I1209 14:59:16.237865 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:16 crc kubenswrapper[4555]: I1209 14:59:16.237873 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:16 crc kubenswrapper[4555]: I1209 14:59:16.237882 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:16 crc kubenswrapper[4555]: I1209 14:59:16.237890 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:16Z","lastTransitionTime":"2025-12-09T14:59:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:16 crc kubenswrapper[4555]: I1209 14:59:16.338948 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:16 crc kubenswrapper[4555]: I1209 14:59:16.338968 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:16 crc kubenswrapper[4555]: I1209 14:59:16.338977 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:16 crc kubenswrapper[4555]: I1209 14:59:16.338986 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:16 crc kubenswrapper[4555]: I1209 14:59:16.338992 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:16Z","lastTransitionTime":"2025-12-09T14:59:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:16 crc kubenswrapper[4555]: I1209 14:59:16.442160 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:16 crc kubenswrapper[4555]: I1209 14:59:16.442192 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:16 crc kubenswrapper[4555]: I1209 14:59:16.442201 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:16 crc kubenswrapper[4555]: I1209 14:59:16.442215 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:16 crc kubenswrapper[4555]: I1209 14:59:16.442223 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:16Z","lastTransitionTime":"2025-12-09T14:59:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:16 crc kubenswrapper[4555]: I1209 14:59:16.543889 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:16 crc kubenswrapper[4555]: I1209 14:59:16.543940 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:16 crc kubenswrapper[4555]: I1209 14:59:16.543951 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:16 crc kubenswrapper[4555]: I1209 14:59:16.543961 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:16 crc kubenswrapper[4555]: I1209 14:59:16.543967 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:16Z","lastTransitionTime":"2025-12-09T14:59:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:16 crc kubenswrapper[4555]: I1209 14:59:16.645153 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:16 crc kubenswrapper[4555]: I1209 14:59:16.645178 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:16 crc kubenswrapper[4555]: I1209 14:59:16.645188 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:16 crc kubenswrapper[4555]: I1209 14:59:16.645226 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:16 crc kubenswrapper[4555]: I1209 14:59:16.645234 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:16Z","lastTransitionTime":"2025-12-09T14:59:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:16 crc kubenswrapper[4555]: I1209 14:59:16.746717 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:16 crc kubenswrapper[4555]: I1209 14:59:16.746865 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:16 crc kubenswrapper[4555]: I1209 14:59:16.746946 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:16 crc kubenswrapper[4555]: I1209 14:59:16.747025 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:16 crc kubenswrapper[4555]: I1209 14:59:16.747082 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:16Z","lastTransitionTime":"2025-12-09T14:59:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:16 crc kubenswrapper[4555]: I1209 14:59:16.849368 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:16 crc kubenswrapper[4555]: I1209 14:59:16.849404 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:16 crc kubenswrapper[4555]: I1209 14:59:16.849412 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:16 crc kubenswrapper[4555]: I1209 14:59:16.849426 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:16 crc kubenswrapper[4555]: I1209 14:59:16.849435 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:16Z","lastTransitionTime":"2025-12-09T14:59:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:16 crc kubenswrapper[4555]: I1209 14:59:16.951039 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:16 crc kubenswrapper[4555]: I1209 14:59:16.951092 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:16 crc kubenswrapper[4555]: I1209 14:59:16.951102 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:16 crc kubenswrapper[4555]: I1209 14:59:16.951115 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:16 crc kubenswrapper[4555]: I1209 14:59:16.951124 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:16Z","lastTransitionTime":"2025-12-09T14:59:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:17 crc kubenswrapper[4555]: I1209 14:59:17.052750 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:17 crc kubenswrapper[4555]: I1209 14:59:17.052805 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:17 crc kubenswrapper[4555]: I1209 14:59:17.052816 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:17 crc kubenswrapper[4555]: I1209 14:59:17.052830 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:17 crc kubenswrapper[4555]: I1209 14:59:17.052839 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:17Z","lastTransitionTime":"2025-12-09T14:59:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:17 crc kubenswrapper[4555]: I1209 14:59:17.154149 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:17 crc kubenswrapper[4555]: I1209 14:59:17.154174 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:17 crc kubenswrapper[4555]: I1209 14:59:17.154181 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:17 crc kubenswrapper[4555]: I1209 14:59:17.154191 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:17 crc kubenswrapper[4555]: I1209 14:59:17.154198 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:17Z","lastTransitionTime":"2025-12-09T14:59:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:17 crc kubenswrapper[4555]: I1209 14:59:17.160564 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:59:17 crc kubenswrapper[4555]: E1209 14:59:17.160727 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 14:59:17 crc kubenswrapper[4555]: I1209 14:59:17.160584 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:59:17 crc kubenswrapper[4555]: E1209 14:59:17.160904 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 14:59:17 crc kubenswrapper[4555]: I1209 14:59:17.160584 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:59:17 crc kubenswrapper[4555]: E1209 14:59:17.161055 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 14:59:17 crc kubenswrapper[4555]: I1209 14:59:17.255348 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:17 crc kubenswrapper[4555]: I1209 14:59:17.255378 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:17 crc kubenswrapper[4555]: I1209 14:59:17.255389 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:17 crc kubenswrapper[4555]: I1209 14:59:17.255400 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:17 crc kubenswrapper[4555]: I1209 14:59:17.255408 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:17Z","lastTransitionTime":"2025-12-09T14:59:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:17 crc kubenswrapper[4555]: I1209 14:59:17.357009 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:17 crc kubenswrapper[4555]: I1209 14:59:17.357033 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:17 crc kubenswrapper[4555]: I1209 14:59:17.357059 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:17 crc kubenswrapper[4555]: I1209 14:59:17.357069 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:17 crc kubenswrapper[4555]: I1209 14:59:17.357076 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:17Z","lastTransitionTime":"2025-12-09T14:59:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:17 crc kubenswrapper[4555]: I1209 14:59:17.458866 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:17 crc kubenswrapper[4555]: I1209 14:59:17.458902 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:17 crc kubenswrapper[4555]: I1209 14:59:17.458926 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:17 crc kubenswrapper[4555]: I1209 14:59:17.458938 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:17 crc kubenswrapper[4555]: I1209 14:59:17.458947 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:17Z","lastTransitionTime":"2025-12-09T14:59:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:17 crc kubenswrapper[4555]: I1209 14:59:17.560016 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:17 crc kubenswrapper[4555]: I1209 14:59:17.560051 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:17 crc kubenswrapper[4555]: I1209 14:59:17.560060 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:17 crc kubenswrapper[4555]: I1209 14:59:17.560074 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:17 crc kubenswrapper[4555]: I1209 14:59:17.560083 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:17Z","lastTransitionTime":"2025-12-09T14:59:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:17 crc kubenswrapper[4555]: I1209 14:59:17.662016 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:17 crc kubenswrapper[4555]: I1209 14:59:17.662046 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:17 crc kubenswrapper[4555]: I1209 14:59:17.662054 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:17 crc kubenswrapper[4555]: I1209 14:59:17.662065 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:17 crc kubenswrapper[4555]: I1209 14:59:17.662074 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:17Z","lastTransitionTime":"2025-12-09T14:59:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:17 crc kubenswrapper[4555]: I1209 14:59:17.763807 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:17 crc kubenswrapper[4555]: I1209 14:59:17.763842 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:17 crc kubenswrapper[4555]: I1209 14:59:17.763851 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:17 crc kubenswrapper[4555]: I1209 14:59:17.763863 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:17 crc kubenswrapper[4555]: I1209 14:59:17.763872 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:17Z","lastTransitionTime":"2025-12-09T14:59:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:17 crc kubenswrapper[4555]: I1209 14:59:17.865391 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:17 crc kubenswrapper[4555]: I1209 14:59:17.865425 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:17 crc kubenswrapper[4555]: I1209 14:59:17.865434 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:17 crc kubenswrapper[4555]: I1209 14:59:17.865449 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:17 crc kubenswrapper[4555]: I1209 14:59:17.865457 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:17Z","lastTransitionTime":"2025-12-09T14:59:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:17 crc kubenswrapper[4555]: I1209 14:59:17.967042 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:17 crc kubenswrapper[4555]: I1209 14:59:17.967075 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:17 crc kubenswrapper[4555]: I1209 14:59:17.967085 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:17 crc kubenswrapper[4555]: I1209 14:59:17.967097 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:17 crc kubenswrapper[4555]: I1209 14:59:17.967106 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:17Z","lastTransitionTime":"2025-12-09T14:59:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:18 crc kubenswrapper[4555]: I1209 14:59:18.069122 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:18 crc kubenswrapper[4555]: I1209 14:59:18.069150 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:18 crc kubenswrapper[4555]: I1209 14:59:18.069158 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:18 crc kubenswrapper[4555]: I1209 14:59:18.069169 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:18 crc kubenswrapper[4555]: I1209 14:59:18.069178 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:18Z","lastTransitionTime":"2025-12-09T14:59:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:18 crc kubenswrapper[4555]: I1209 14:59:18.160449 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sc2q" Dec 09 14:59:18 crc kubenswrapper[4555]: E1209 14:59:18.160565 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6sc2q" podUID="35f7e7bd-b667-42fa-b040-4a186e4570e3" Dec 09 14:59:18 crc kubenswrapper[4555]: I1209 14:59:18.170959 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:18 crc kubenswrapper[4555]: I1209 14:59:18.171016 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:18 crc kubenswrapper[4555]: I1209 14:59:18.171026 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:18 crc kubenswrapper[4555]: I1209 14:59:18.171036 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:18 crc kubenswrapper[4555]: I1209 14:59:18.171043 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:18Z","lastTransitionTime":"2025-12-09T14:59:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:18 crc kubenswrapper[4555]: I1209 14:59:18.266945 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/35f7e7bd-b667-42fa-b040-4a186e4570e3-metrics-certs\") pod \"network-metrics-daemon-6sc2q\" (UID: \"35f7e7bd-b667-42fa-b040-4a186e4570e3\") " pod="openshift-multus/network-metrics-daemon-6sc2q" Dec 09 14:59:18 crc kubenswrapper[4555]: E1209 14:59:18.267045 4555 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 14:59:18 crc kubenswrapper[4555]: E1209 14:59:18.267109 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/35f7e7bd-b667-42fa-b040-4a186e4570e3-metrics-certs podName:35f7e7bd-b667-42fa-b040-4a186e4570e3 nodeName:}" failed. No retries permitted until 2025-12-09 14:59:50.267094906 +0000 UTC m=+100.403807624 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/35f7e7bd-b667-42fa-b040-4a186e4570e3-metrics-certs") pod "network-metrics-daemon-6sc2q" (UID: "35f7e7bd-b667-42fa-b040-4a186e4570e3") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 14:59:18 crc kubenswrapper[4555]: I1209 14:59:18.272317 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:18 crc kubenswrapper[4555]: I1209 14:59:18.272339 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:18 crc kubenswrapper[4555]: I1209 14:59:18.272346 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:18 crc kubenswrapper[4555]: I1209 14:59:18.272359 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:18 crc kubenswrapper[4555]: I1209 14:59:18.272367 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:18Z","lastTransitionTime":"2025-12-09T14:59:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:18 crc kubenswrapper[4555]: I1209 14:59:18.374559 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:18 crc kubenswrapper[4555]: I1209 14:59:18.374586 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:18 crc kubenswrapper[4555]: I1209 14:59:18.374594 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:18 crc kubenswrapper[4555]: I1209 14:59:18.374609 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:18 crc kubenswrapper[4555]: I1209 14:59:18.374617 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:18Z","lastTransitionTime":"2025-12-09T14:59:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:18 crc kubenswrapper[4555]: I1209 14:59:18.476005 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:18 crc kubenswrapper[4555]: I1209 14:59:18.476025 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:18 crc kubenswrapper[4555]: I1209 14:59:18.476033 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:18 crc kubenswrapper[4555]: I1209 14:59:18.476044 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:18 crc kubenswrapper[4555]: I1209 14:59:18.476051 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:18Z","lastTransitionTime":"2025-12-09T14:59:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:18 crc kubenswrapper[4555]: I1209 14:59:18.577211 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:18 crc kubenswrapper[4555]: I1209 14:59:18.577233 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:18 crc kubenswrapper[4555]: I1209 14:59:18.577241 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:18 crc kubenswrapper[4555]: I1209 14:59:18.577251 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:18 crc kubenswrapper[4555]: I1209 14:59:18.577258 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:18Z","lastTransitionTime":"2025-12-09T14:59:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:18 crc kubenswrapper[4555]: I1209 14:59:18.678457 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:18 crc kubenswrapper[4555]: I1209 14:59:18.678487 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:18 crc kubenswrapper[4555]: I1209 14:59:18.678498 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:18 crc kubenswrapper[4555]: I1209 14:59:18.678511 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:18 crc kubenswrapper[4555]: I1209 14:59:18.678519 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:18Z","lastTransitionTime":"2025-12-09T14:59:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:18 crc kubenswrapper[4555]: I1209 14:59:18.779830 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:18 crc kubenswrapper[4555]: I1209 14:59:18.779853 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:18 crc kubenswrapper[4555]: I1209 14:59:18.779861 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:18 crc kubenswrapper[4555]: I1209 14:59:18.779872 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:18 crc kubenswrapper[4555]: I1209 14:59:18.779888 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:18Z","lastTransitionTime":"2025-12-09T14:59:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:18 crc kubenswrapper[4555]: I1209 14:59:18.883531 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:18 crc kubenswrapper[4555]: I1209 14:59:18.883579 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:18 crc kubenswrapper[4555]: I1209 14:59:18.883588 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:18 crc kubenswrapper[4555]: I1209 14:59:18.883602 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:18 crc kubenswrapper[4555]: I1209 14:59:18.883611 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:18Z","lastTransitionTime":"2025-12-09T14:59:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:18 crc kubenswrapper[4555]: I1209 14:59:18.986453 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:18 crc kubenswrapper[4555]: I1209 14:59:18.986499 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:18 crc kubenswrapper[4555]: I1209 14:59:18.986509 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:18 crc kubenswrapper[4555]: I1209 14:59:18.986522 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:18 crc kubenswrapper[4555]: I1209 14:59:18.986532 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:18Z","lastTransitionTime":"2025-12-09T14:59:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:19 crc kubenswrapper[4555]: I1209 14:59:19.088352 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:19 crc kubenswrapper[4555]: I1209 14:59:19.088399 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:19 crc kubenswrapper[4555]: I1209 14:59:19.088414 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:19 crc kubenswrapper[4555]: I1209 14:59:19.088424 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:19 crc kubenswrapper[4555]: I1209 14:59:19.088432 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:19Z","lastTransitionTime":"2025-12-09T14:59:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:19 crc kubenswrapper[4555]: I1209 14:59:19.161012 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:59:19 crc kubenswrapper[4555]: I1209 14:59:19.161025 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:59:19 crc kubenswrapper[4555]: E1209 14:59:19.161089 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 14:59:19 crc kubenswrapper[4555]: I1209 14:59:19.161044 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:59:19 crc kubenswrapper[4555]: E1209 14:59:19.161174 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 14:59:19 crc kubenswrapper[4555]: E1209 14:59:19.161199 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 14:59:19 crc kubenswrapper[4555]: I1209 14:59:19.190737 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:19 crc kubenswrapper[4555]: I1209 14:59:19.190776 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:19 crc kubenswrapper[4555]: I1209 14:59:19.190787 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:19 crc kubenswrapper[4555]: I1209 14:59:19.190801 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:19 crc kubenswrapper[4555]: I1209 14:59:19.190811 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:19Z","lastTransitionTime":"2025-12-09T14:59:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:19 crc kubenswrapper[4555]: I1209 14:59:19.293079 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:19 crc kubenswrapper[4555]: I1209 14:59:19.293111 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:19 crc kubenswrapper[4555]: I1209 14:59:19.293122 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:19 crc kubenswrapper[4555]: I1209 14:59:19.293136 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:19 crc kubenswrapper[4555]: I1209 14:59:19.293146 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:19Z","lastTransitionTime":"2025-12-09T14:59:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:19 crc kubenswrapper[4555]: I1209 14:59:19.392113 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gjjcm_3e34542c-29e8-43d3-add3-42aa080e0c8a/kube-multus/0.log" Dec 09 14:59:19 crc kubenswrapper[4555]: I1209 14:59:19.392153 4555 generic.go:334] "Generic (PLEG): container finished" podID="3e34542c-29e8-43d3-add3-42aa080e0c8a" containerID="b138ef0764baeeae025ff996cd643edabbc1de8f567e73f41fa748f481929656" exitCode=1 Dec 09 14:59:19 crc kubenswrapper[4555]: I1209 14:59:19.392173 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gjjcm" event={"ID":"3e34542c-29e8-43d3-add3-42aa080e0c8a","Type":"ContainerDied","Data":"b138ef0764baeeae025ff996cd643edabbc1de8f567e73f41fa748f481929656"} Dec 09 14:59:19 crc kubenswrapper[4555]: I1209 14:59:19.392425 4555 scope.go:117] "RemoveContainer" containerID="b138ef0764baeeae025ff996cd643edabbc1de8f567e73f41fa748f481929656" Dec 09 14:59:19 crc kubenswrapper[4555]: I1209 14:59:19.394206 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:19 crc kubenswrapper[4555]: I1209 14:59:19.394257 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:19 crc kubenswrapper[4555]: I1209 14:59:19.394269 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:19 crc kubenswrapper[4555]: I1209 14:59:19.394314 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:19 crc kubenswrapper[4555]: I1209 14:59:19.394324 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:19Z","lastTransitionTime":"2025-12-09T14:59:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:19 crc kubenswrapper[4555]: I1209 14:59:19.402271 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"031bef68-de1b-47ae-92d9-8f3ab57baecb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd53bcc6affff59de2add405be1496710b4488d3794d358533ddaeecb1c50c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cd3af11c101edd63da0124a2190c0ee60408cf12e241de0b5743aff5732738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://787954aa7f87026339e7b167ba10c60c81f91c4d0f234fc508684f929edda0ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c591bf964c364ed5dfb0b30ae43f7fd456c549f701b0f1b8f7ab9ca7b8648f1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7803e19de1c3c096dfb29cd40b66a640cf8fe695b24eaf0ecfef38e88feefc27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:19Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:19 crc kubenswrapper[4555]: I1209 14:59:19.410269 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23358419-437c-41bf-bf1f-0da572d7ae77\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5ec008c2fa2ff6b0bffe7a32708ff1d0bfcfc0b6edbdd084977b99a2051bdc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://376e853c04e3ccafef162872380c7bf1dc0301c04c70c5428b965ca18989088f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6099f0072464b66ae672e70c295fe21a13657b81cc71dcecc55314cd91816c75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b8f1c6dd89c91a8ff2ca8baf361ee78811eeb8e10f2c4bb0504644cdd1302d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:19Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:19 crc kubenswrapper[4555]: I1209 14:59:19.424326 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:19Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:19 crc kubenswrapper[4555]: I1209 14:59:19.439535 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6sc2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35f7e7bd-b667-42fa-b040-4a186e4570e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mc8nv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mc8nv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:46Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6sc2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:19Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:19 crc kubenswrapper[4555]: I1209 14:59:19.462939 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cb89a32-a132-4a5c-88ca-e561d2462abb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8452332d27457e371020c42533ef78285e33851568eed69be71bc9d10e663b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32422f3e8abddcc0d533cce0b4fc55c65309e00a02d4757d415d6f63f9b2e47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9db356805ba4080379f39ddf432276ac77417504341e77c1cc2d704756af1f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550d7f8a2e6cf94de70026192f2660d59f31fa1
675a7c7e432767db5bf58c1b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://630ad7ccda8e51ae4d116b5821e71ebcc3fb68b4614de58470610807ad351ed6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:19Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:19 crc kubenswrapper[4555]: I1209 14:59:19.473727 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffcfa1a3-f24c-4809-8468-636ab9b1a063\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5c1f863d399da019632ceb8d698cf8aa186d0f533ff8a49da7c66dd11086253\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa09f69008b086bb69e768437b40d896803a11f82f6949202ed0a456f3a213be\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b44fa2c57434247fea11c1e1fbed34ad1df532e7e6bba0623634541322627f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a30d185f354ca39197fe30559783400a7d218ef0a162fd5fa5e5039d1946d127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a30d185f354ca39197fe30559783400a7d218ef0a162fd5fa5e5039d1946d127\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:19Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:19 crc kubenswrapper[4555]: I1209 14:59:19.482643 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d81323acabedcf739459ccc35af7dcbf68cd1159c9304c0ef841d774287425cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:19Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:19 crc kubenswrapper[4555]: I1209 14:59:19.491763 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:19Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:19 crc kubenswrapper[4555]: I1209 14:59:19.496616 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:19 crc kubenswrapper[4555]: I1209 14:59:19.496645 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:19 crc kubenswrapper[4555]: I1209 14:59:19.496653 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:19 crc kubenswrapper[4555]: I1209 14:59:19.496666 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:19 crc kubenswrapper[4555]: I1209 14:59:19.496674 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:19Z","lastTransitionTime":"2025-12-09T14:59:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:19 crc kubenswrapper[4555]: I1209 14:59:19.499500 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://222e3c51a925bbfd932c3620f13ad919006e2dad2c9063c60559ece4dd239f3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:19Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:19 crc kubenswrapper[4555]: I1209 14:59:19.507797 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gjjcm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e34542c-29e8-43d3-add3-42aa080e0c8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:19Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b138ef0764baeeae025ff996cd643edabbc1de8f567e73f41fa748f481929656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b138ef0764baeeae025ff996cd643edabbc1de8f567e73f41fa748f481929656\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T14:59:18Z\\\",\\\"message\\\":\\\"2025-12-09T14:58:33+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_43b55716-79db-4d3f-93e7-dbc41035fea1\\\\n2025-12-09T14:58:33+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_43b55716-79db-4d3f-93e7-dbc41035fea1 to /host/opt/cni/bin/\\\\n2025-12-09T14:58:33Z [verbose] multus-daemon started\\\\n2025-12-09T14:58:33Z [verbose] Readiness Indicator file check\\\\n2025-12-09T14:59:18Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqc9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for 
pod \"openshift-multus\"/\"multus-gjjcm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:19Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:19 crc kubenswrapper[4555]: I1209 14:59:19.519756 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c87d128c-39da-4880-9efc-7d2e6319a854\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dee71204e7a71c311143a6ed6fbde69968bc877053d6a400d252eaa010a88ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6cd4a3aeb4f2f1ad5ad6b22f80134a393ccb4c5b7e22c33d97324c8f70800d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics
-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a05bba24ff7f1d956fa0d3cb5949de61a278cddb4674bdb723749a2b005d82d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec3d3a350dce6c9b5ea637d00abdaf43796a5595105dc82133f3d84349c12c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://932166905ddf0fc44b9b7356a8f5efe68e1d44307d8e486946c8285ae969b8e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64685d2778ef0cf9d316897042a9e9353603a0cc7a234baf821ec03b149aeb07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de4d6b9f459f3e66cc78f2c00c0e4c9a39554ea732d4f2e1d2b85175855ef2bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de4d6b9f459f3e66cc78f2c00c0e4c9a39554ea732d4f2e1d2b85175855ef2bd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T14:58:57Z\\\",\\\"message\\\":\\\"ice k8s.ovn.org/owner:openshift-machine-config-operator/machine-config-operator]} name:Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.183:9001:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {5b85277d-d9b7-4a68-8e4e-2b80594d9347}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1209 14:58:57.738539 6180 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver/api]} name:Service_openshift-apiserver/api_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.37:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {88e20c31-5b8d-4d44-bbd8-dba87b7dbaf0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1209 14:58:57.738561 6180 transact.go:42] Configuring OVN: 
[{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-p4bl4_openshift-ovn-kubernetes(c87d128c-39da-4880-9efc-7d2e6319a854)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9dc3533604ee31d652a75a8c65146c35b320c41f2d4fe85c3c03cadb99a9d46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ce6bcdccee7782ed6a2472052064ea5f0f9a5f25a8645c692165a2a4bf16dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9ce6bcdccee7782ed6a2472052064ea5f0f9a5f25a8645c692165a2a4bf16dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p4bl4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:19Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:19 crc kubenswrapper[4555]: I1209 14:59:19.526682 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-72msc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4f13f82-a44f-461d-93b0-916a7ed0b925\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06c787dc46c305e7388f67f85601eb3e724c501d4da0504d4f8fd569f533c634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2spqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-72msc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:19Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:19 crc kubenswrapper[4555]: I1209 14:59:19.534709 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:19Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:19 crc kubenswrapper[4555]: I1209 14:59:19.542853 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd5a66337668da0b71f8033ec52d3d1ed9ac69f468d6e48c1e863204ec6a6349\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2fd50693723f3d30a7f5fb890ed870f55b357017b85cdf7206ffc97d0d3d9b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:19Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:19 crc kubenswrapper[4555]: I1209 14:59:19.551660 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b929ab12-7d60-4e9f-a1a7-3e2d8b086d36\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04e20c1851a9313f071ddc6ae1191e467f3bea284ea6e0defd914dc5c926b7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff6nf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b220eef2e4180fd65d4a5381999b91753bb7322a9ab7101edf783cb800715fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea17
7225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff6nf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x6p6q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:19Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:19 crc kubenswrapper[4555]: I1209 14:59:19.560814 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tpsq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd8f66c1-da34-466f-ba53-4848c96b1287\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ae3cc97b46fa36f9bd6f439124d81700af58cb44671d46ffa2936260f6dda4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2028082f4c35a431ec183abac3ace39ae9d9d292a8b3b879cf35cf03dc606d21\\\",\\\"image\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2028082f4c35a431ec183abac3ace39ae9d9d292a8b3b879cf35cf03dc606d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f6498db60b815cc152db11a9d809f415e6ed632f66cf7eaeecbacb230d59dbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f6498db60b815cc152db11a9d809f415e6ed632f66cf7eaeecbacb230d59dbe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd308a4726360ffaa40fd9895d4684062d998a9dec9c76222a4be5140ece0ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd308a4726360ffaa40fd9895d4684062d998a9dec9c76222a4be5140ece0ced\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin
\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e047250a4bced8e37ee26d9ba3065a61294fa9cc6d9491b6b96e35758b431edc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e047250a4bced8e37ee26d9ba3065a61294fa9cc6d9491b6b96e35758b431edc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d24862fb9296afeb3ec86c0cd5e862021c7e4c8a0f139f9fd11dfd1faba2b392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d24862fb9296afeb3ec86c0cd5e862021c7e4c8a0f139f9fd11dfd1faba2b392\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ac07e538318096b1c9c2c8d0608f0b4d9b4367df0bb24575591b2f3f6c28c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ac07e5383180
96b1c9c2c8d0608f0b4d9b4367df0bb24575591b2f3f6c28c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tpsq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:19Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:19 crc kubenswrapper[4555]: I1209 14:59:19.567667 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qchk6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae951ddc-0e90-497c-af85-8e39088f32d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59535f0584581ae9d42fa3c1d9f4f204c8a2c7c96d211a2b8d74806eb3901a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwhfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1a5e10f192fc24405b980ce611a09f1e0c6d49c4c94df89bceef151a747f1aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/opens
hift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwhfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qchk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:19Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:19 crc kubenswrapper[4555]: I1209 14:59:19.574035 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t6kl9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c97d4dc5-ce47-4a7b-9511-7168db0ee4b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd97a458fe790bf9d7a8587a77d464c8bc4f97919bf4646660d92bd016be2707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47xhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\
\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t6kl9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:19Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:19 crc kubenswrapper[4555]: I1209 14:59:19.598764 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:19 crc kubenswrapper[4555]: I1209 14:59:19.598794 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:19 crc kubenswrapper[4555]: I1209 14:59:19.598803 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:19 crc kubenswrapper[4555]: I1209 14:59:19.598817 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:19 crc kubenswrapper[4555]: I1209 14:59:19.598825 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:19Z","lastTransitionTime":"2025-12-09T14:59:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:19 crc kubenswrapper[4555]: I1209 14:59:19.700719 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:19 crc kubenswrapper[4555]: I1209 14:59:19.700740 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:19 crc kubenswrapper[4555]: I1209 14:59:19.700748 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:19 crc kubenswrapper[4555]: I1209 14:59:19.700759 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:19 crc kubenswrapper[4555]: I1209 14:59:19.700766 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:19Z","lastTransitionTime":"2025-12-09T14:59:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:19 crc kubenswrapper[4555]: I1209 14:59:19.802113 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:19 crc kubenswrapper[4555]: I1209 14:59:19.802168 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:19 crc kubenswrapper[4555]: I1209 14:59:19.802177 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:19 crc kubenswrapper[4555]: I1209 14:59:19.802187 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:19 crc kubenswrapper[4555]: I1209 14:59:19.802196 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:19Z","lastTransitionTime":"2025-12-09T14:59:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:19 crc kubenswrapper[4555]: I1209 14:59:19.903960 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:19 crc kubenswrapper[4555]: I1209 14:59:19.904010 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:19 crc kubenswrapper[4555]: I1209 14:59:19.904019 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:19 crc kubenswrapper[4555]: I1209 14:59:19.904032 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:19 crc kubenswrapper[4555]: I1209 14:59:19.904041 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:19Z","lastTransitionTime":"2025-12-09T14:59:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.005561 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.005608 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.005618 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.005631 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.005639 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:20Z","lastTransitionTime":"2025-12-09T14:59:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.107433 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.107480 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.107490 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.107500 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.107508 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:20Z","lastTransitionTime":"2025-12-09T14:59:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.161081 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sc2q" Dec 09 14:59:20 crc kubenswrapper[4555]: E1209 14:59:20.161221 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6sc2q" podUID="35f7e7bd-b667-42fa-b040-4a186e4570e3" Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.168571 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t6kl9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c97d4dc5-ce47-4a7b-9511-7168db0ee4b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd97a458fe790bf9d7a8587a77d464c8bc4f97919bf4646660d92bd016be2707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47xhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t6kl9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:20Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.175319 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6sc2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35f7e7bd-b667-42fa-b040-4a186e4570e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mc8nv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mc8nv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:46Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6sc2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:20Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.184034 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"031bef68-de1b-47ae-92d9-8f3ab57baecb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd53bcc6affff59de2add405be1496710b4488d3794d358533ddaeecb1c50c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cd3af11c101edd63da0124a2190c0ee60408cf12e241de0b5743aff5732738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://787954aa7f87026339e7b167ba10c60c81f91c4d0f234fc508684f929edda0ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c591bf964c364ed5dfb0b30ae43f7fd456c549f701b0f1b8f7ab9ca7b8648f1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7803e19de1c3c096dfb29cd40b66a640cf8fe695b24eaf0ecfef38e88feefc27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:20Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.192355 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23358419-437c-41bf-bf1f-0da572d7ae77\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5ec008c2fa2ff6b0bffe7a32708ff1d0bfcfc0b6edbdd084977b99a2051bdc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://376e853c04e3ccafef162872380c7bf1dc0301c04c70c5428b965ca18989088f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6099f0072464b66ae672e70c295fe21a13657b81cc71dcecc55314cd91816c75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b8f1c6dd89c91a8ff2ca8baf361ee78811eeb8e10f2c4bb0504644cdd1302d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:20Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.199862 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:20Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.207570 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:20Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.208632 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.208668 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.208678 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.208690 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.208698 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:20Z","lastTransitionTime":"2025-12-09T14:59:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.216258 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://222e3c51a925bbfd932c3620f13ad919006e2dad2c9063c60559ece4dd239f3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:20Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.224399 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gjjcm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e34542c-29e8-43d3-add3-42aa080e0c8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:19Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b138ef0764baeeae025ff996cd643edabbc1de8f567e73f41fa748f481929656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b138ef0764baeeae025ff996cd643edabbc1de8f567e73f41fa748f481929656\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T14:59:18Z\\\",\\\"message\\\":\\\"2025-12-09T14:58:33+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_43b55716-79db-4d3f-93e7-dbc41035fea1\\\\n2025-12-09T14:58:33+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_43b55716-79db-4d3f-93e7-dbc41035fea1 to /host/opt/cni/bin/\\\\n2025-12-09T14:58:33Z [verbose] multus-daemon started\\\\n2025-12-09T14:58:33Z [verbose] Readiness Indicator file check\\\\n2025-12-09T14:59:18Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqc9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for 
pod \"openshift-multus\"/\"multus-gjjcm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:20Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.236256 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c87d128c-39da-4880-9efc-7d2e6319a854\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dee71204e7a71c311143a6ed6fbde69968bc877053d6a400d252eaa010a88ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6cd4a3aeb4f2f1ad5ad6b22f80134a393ccb4c5b7e22c33d97324c8f70800d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics
-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a05bba24ff7f1d956fa0d3cb5949de61a278cddb4674bdb723749a2b005d82d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec3d3a350dce6c9b5ea637d00abdaf43796a5595105dc82133f3d84349c12c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://932166905ddf0fc44b9b7356a8f5efe68e1d44307d8e486946c8285ae969b8e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64685d2778ef0cf9d316897042a9e9353603a0cc7a234baf821ec03b149aeb07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de4d6b9f459f3e66cc78f2c00c0e4c9a39554ea732d4f2e1d2b85175855ef2bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de4d6b9f459f3e66cc78f2c00c0e4c9a39554ea732d4f2e1d2b85175855ef2bd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T14:58:57Z\\\",\\\"message\\\":\\\"ice k8s.ovn.org/owner:openshift-machine-config-operator/machine-config-operator]} name:Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.183:9001:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {5b85277d-d9b7-4a68-8e4e-2b80594d9347}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1209 14:58:57.738539 6180 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver/api]} name:Service_openshift-apiserver/api_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.37:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {88e20c31-5b8d-4d44-bbd8-dba87b7dbaf0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1209 14:58:57.738561 6180 transact.go:42] Configuring OVN: 
[{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-p4bl4_openshift-ovn-kubernetes(c87d128c-39da-4880-9efc-7d2e6319a854)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9dc3533604ee31d652a75a8c65146c35b320c41f2d4fe85c3c03cadb99a9d46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ce6bcdccee7782ed6a2472052064ea5f0f9a5f25a8645c692165a2a4bf16dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9ce6bcdccee7782ed6a2472052064ea5f0f9a5f25a8645c692165a2a4bf16dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p4bl4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:20Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.242917 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-72msc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4f13f82-a44f-461d-93b0-916a7ed0b925\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06c787dc46c305e7388f67f85601eb3e724c501d4da0504d4f8fd569f533c634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2spqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-72msc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:20Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.255665 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cb89a32-a132-4a5c-88ca-e561d2462abb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8452332d27457e371020c42533ef78285e33851568eed69be71bc9d10e663b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32422f3e8abddcc0d533cce0b4fc55c65309e00a02d4757d415d6f63f9b2e47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9db356805ba4080379f39ddf432276ac77417504341e77c1cc2d704756af1f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550d7f8a2e6cf94de70026192f2660d59f31fa1
675a7c7e432767db5bf58c1b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://630ad7ccda8e51ae4d116b5821e71ebcc3fb68b4614de58470610807ad351ed6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:20Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.262886 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffcfa1a3-f24c-4809-8468-636ab9b1a063\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5c1f863d399da019632ceb8d698cf8aa186d0f533ff8a49da7c66dd11086253\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa09f69008b086bb69e768437b40d896803a11f82f6949202ed0a456f3a213be\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b44fa2c57434247fea11c1e1fbed34ad1df532e7e6bba0623634541322627f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a30d185f354ca39197fe30559783400a7d218ef0a162fd5fa5e5039d1946d127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a30d185f354ca39197fe30559783400a7d218ef0a162fd5fa5e5039d1946d127\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:20Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.270803 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d81323acabedcf739459ccc35af7dcbf68cd1159c9304c0ef841d774287425cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:20Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.277493 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b929ab12-7d60-4e9f-a1a7-3e2d8b086d36\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04e20c1851a9313f071ddc6ae1191e467f3bea284ea6e0defd914dc5c926b7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff6nf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b220eef2e4180fd65d4a5381999b91753bb7322a9ab7101edf783cb800715fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff6nf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x6p6q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:20Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.287693 4555 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tpsq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd8f66c1-da34-466f-ba53-4848c96b1287\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ae3cc97b46fa36f9bd6f439124d81700af58cb44671d46ffa2936260f6dda4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2028082f4c35a431ec183abac3ace39ae9d9d292a8b3b879cf35cf03dc606d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2028082f4c35a431ec183abac3ace39ae9d9d292a8b3b879cf35cf03dc606d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f6498db60b815cc152db11a9d809f415e6ed632f66cf7eaeecbacb230d59dbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2c
c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f6498db60b815cc152db11a9d809f415e6ed632f66cf7eaeecbacb230d59dbe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd308a4726360ffaa40fd9895d4684062d998a9dec9c76222a4be5140ece0ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd308a4726360ffaa40fd9895d4684062d998a9dec9c76222a4be5140ece0ced\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e047250a4bced8e37ee26d9ba3065a61294fa9cc6d9491b6b96e35758b431edc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e047250a4bced8e37ee26d9ba3065a61294fa9cc6d9491b6b96e35758b431edc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d24862fb9296afeb3ec86c0cd5e862021c7e4c8a0f139f9fd11dfd1faba2b392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d24862fb9296afeb3ec86c0cd5e862021c7e4c8a0f139f9fd11dfd1faba2b392\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ac07e538318096b1c9c2c8d0608f0b4d9b4367df0bb24575591b2f3f6c28c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ac07e538318096b1c9c2c8d0608f0b4d9b4367df0bb24575591b2f3f6c28c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tpsq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:20Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.297523 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qchk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae951ddc-0e90-497c-af85-8e39088f32d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59535f0584581ae9d42fa3c1d9f4f204c8a2c7c96d211a2b8d74806eb3901a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwhfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1a5e10f192fc24405b980ce611a09f1e0c6d49c4c94df89bceef151a747f1aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwhfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qchk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:20Z is after 2025-08-24T17:21:41Z" Dec 09 
14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.305282 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:20Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.310193 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.310230 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.310239 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.310252 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.310260 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:20Z","lastTransitionTime":"2025-12-09T14:59:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.313421 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd5a66337668da0b71f8033ec52d3d1ed9ac69f468d6e48c1e863204ec6a6349\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2fd50693723f3d30a7f5fb890ed870f55b357017b85cdf7206ffc97d0d3d9b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:20Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.395248 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gjjcm_3e34542c-29e8-43d3-add3-42aa080e0c8a/kube-multus/0.log" Dec 09 
14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.395315 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gjjcm" event={"ID":"3e34542c-29e8-43d3-add3-42aa080e0c8a","Type":"ContainerStarted","Data":"73a16b54312a5c81ee3d5b4753cd53c9a01cc7f174b340b1f0db44b019541fbd"} Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.403155 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:20Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.411406 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd5a66337668da0b71f8033ec52d3d1ed9ac69f468d6e48c1e863204ec6a6349\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2fd50693723f3d30a7f5fb890ed870f55b357017b85cdf7206ffc97d0d3d9b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:20Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.412406 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.412513 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.412524 4555 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.412557 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.412566 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:20Z","lastTransitionTime":"2025-12-09T14:59:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.418593 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b929ab12-7d60-4e9f-a1a7-3e2d8b086d36\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04e20c1851a9313f071ddc6ae1191e467f3bea284ea6e0defd914dc5c926b7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff6nf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b220eef2e4180fd65d4a5381999b91753bb7322a9ab7101edf783cb800715fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\
\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff6nf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x6p6q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:20Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.427546 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tpsq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd8f66c1-da34-466f-ba53-4848c96b1287\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ae3cc97b46fa36f9bd6f439124d81700af58cb44671d46ffa2936260f6dda4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2028082f4c35a431ec183abac3ace39ae9d9d292a8b3b879cf35cf03dc606d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2028082f4c35a431ec183abac3ace39ae
9d9d292a8b3b879cf35cf03dc606d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f6498db60b815cc152db11a9d809f415e6ed632f66cf7eaeecbacb230d59dbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f6498db60b815cc152db11a9d809f415e6ed632f66cf7eaeecbacb230d59dbe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd308a4726360ffaa40fd9895d4684062d998a9dec9c76222a4be5140ece0ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd308a4726360ffaa40fd9895d4684062d998a9dec9c76222a4be5140ece0ced\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e047250a4bced8e37ee26d9ba3065a61294fa9cc6d9491b6b96e35758b431edc\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e047250a4bced8e37ee26d9ba3065a61294fa9cc6d9491b6b96e35758b431edc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d24862fb9296afeb3ec86c0cd5e862021c7e4c8a0f139f9fd11dfd1faba2b392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d24862fb9296afeb3ec86c0cd5e862021c7e4c8a0f139f9fd11dfd1faba2b392\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ac07e538318096b1c9c2c8d0608f0b4d9b4367df0bb24575591b2f3f6c28c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ac07e538318096b1c9c2c8d0608f0b4d9b4367df0bb24575591b2f3f6c28c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tpsq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:20Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.434468 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qchk6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae951ddc-0e90-497c-af85-8e39088f32d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59535f0584581ae9d42fa3c1d9f4f204c8a2c7c96d211a2b8d74806eb3901a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwhfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1a5e10f192fc24405b980ce611a09f1e0c6d49c4c94df89bceef151a747f1aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"e
nv-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwhfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qchk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:20Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.440938 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t6kl9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c97d4dc5-ce47-4a7b-9511-7168db0ee4b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd97a458fe790bf9d7a8587a77d464c8bc4f97919bf4646660d92bd016be2707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47xhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t6kl9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:20Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:20 crc 
kubenswrapper[4555]: I1209 14:59:20.449719 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"031bef68-de1b-47ae-92d9-8f3ab57baecb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd53bcc6affff59de2add405be1496710b4488d3794d358533ddaeecb1c50c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cd3af11c101edd63da0124a2190c0ee60408cf12e241de0b5743aff5732738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://787954aa7f87026339e7b167ba10c60c81f91c4d0f234fc508684f929edda0ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\
\"}]},{\\\"containerID\\\":\\\"cri-o://c591bf964c364ed5dfb0b30ae43f7fd456c549f701b0f1b8f7ab9ca7b8648f1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7803e19de1c3c096dfb29cd40b66a640cf8fe695b24eaf0ecfef38e88feefc27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:20Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.457607 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23358419-437c-41bf-bf1f-0da572d7ae77\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5ec008c2fa2ff6b0bffe7a32708ff1d0bfcfc0b6edbdd084977b99a2051bdc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://376e853c04e3ccafef162872380c7bf1dc0301c04c70c5428b965ca18989088f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6099f0072464b66ae672e70c295fe21a13657b81cc71dcecc55314cd91816c75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b8f1c6dd89c91a8ff2ca8baf361ee78811eeb8e10f2c4bb0504644cdd1302d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:20Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.465292 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:20Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.471786 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6sc2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35f7e7bd-b667-42fa-b040-4a186e4570e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mc8nv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mc8nv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:46Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6sc2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:20Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.484279 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cb89a32-a132-4a5c-88ca-e561d2462abb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8452332d27457e371020c42533ef78285e33851568eed69be71bc9d10e663b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32422f3e8abddcc0d533cce0b4fc55c65309e00a02d4757d415d6f63f9b2e47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9db356805ba4080379f39ddf432276ac77417504341e77c1cc2d704756af1f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550d7f8a2e6cf94de70026192f2660d59f31fa1
675a7c7e432767db5bf58c1b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://630ad7ccda8e51ae4d116b5821e71ebcc3fb68b4614de58470610807ad351ed6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:20Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.491733 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffcfa1a3-f24c-4809-8468-636ab9b1a063\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5c1f863d399da019632ceb8d698cf8aa186d0f533ff8a49da7c66dd11086253\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa09f69008b086bb69e768437b40d896803a11f82f6949202ed0a456f3a213be\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b44fa2c57434247fea11c1e1fbed34ad1df532e7e6bba0623634541322627f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a30d185f354ca39197fe30559783400a7d218ef0a162fd5fa5e5039d1946d127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a30d185f354ca39197fe30559783400a7d218ef0a162fd5fa5e5039d1946d127\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:20Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.499647 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d81323acabedcf739459ccc35af7dcbf68cd1159c9304c0ef841d774287425cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:20Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.507328 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:20Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.514542 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.514566 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.514574 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.514586 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.514596 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:20Z","lastTransitionTime":"2025-12-09T14:59:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.514834 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://222e3c51a925bbfd932c3620f13ad919006e2dad2c9063c60559ece4dd239f3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:20Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.523425 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gjjcm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e34542c-29e8-43d3-add3-42aa080e0c8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73a16b54312a5c81ee3d5b4753cd53c9a01cc7f174b340b1f0db44b019541fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b138ef0764baeeae025ff996cd643edabbc1de8f567e73f41fa748f481929656\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T14:59:18Z\\\",\\\"message\\\":\\\"2025-12-09T14:58:33+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_43b55716-79db-4d3f-93e7-dbc41035fea1\\\\n2025-12-09T14:58:33+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_43b55716-79db-4d3f-93e7-dbc41035fea1 to /host/opt/cni/bin/\\\\n2025-12-09T14:58:33Z [verbose] multus-daemon started\\\\n2025-12-09T14:58:33Z [verbose] Readiness Indicator file check\\\\n2025-12-09T14:59:18Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqc9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gjjcm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:20Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.535598 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c87d128c-39da-4880-9efc-7d2e6319a854\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dee71204e7a71c311143a6ed6fbde69968bc877053d6a400d252eaa010a88ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6cd4a3aeb4f2f1ad5ad6b22f80134a393ccb4c5b7e22c33d97324c8f70800d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a05bba24ff7f1d956fa0d3cb5949de61a278cddb4674bdb723749a2b005d82d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec3d3a350dce6c9b5ea637d00abdaf43796a5595105dc82133f3d84349c12c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://932166905ddf0fc44b9b7356a8f5efe68e1d44307d8e486946c8285ae969b8e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64685d2778ef0cf9d316897042a9e9353603a0cc7a234baf821ec03b149aeb07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-s
ocket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de4d6b9f459f3e66cc78f2c00c0e4c9a39554ea732d4f2e1d2b85175855ef2bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de4d6b9f459f3e66cc78f2c00c0e4c9a39554ea732d4f2e1d2b85175855ef2bd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T14:58:57Z\\\",\\\"message\\\":\\\"ice k8s.ovn.org/owner:openshift-machine-config-operator/machine-config-operator]} name:Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.183:9001:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {5b85277d-d9b7-4a68-8e4e-2b80594d9347}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1209 14:58:57.738539 6180 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver/api]} name:Service_openshift-apiserver/api_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.37:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {88e20c31-5b8d-4d44-bbd8-dba87b7dbaf0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1209 14:58:57.738561 6180 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-p4bl4_openshift-ovn-kubernetes(c87d128c-39da-4880-9efc-7d2e6319a854)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9dc3533604ee31d652a75a8c65146c35b320c41f2d4fe85c3c03cadb99a9d46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ce6bcdccee7782ed6a2472052064ea5f0f9a5f25a8645c692165a2a4bf16dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9ce6bcdccee7782ed6a2472052064ea5f0f9a5f25a8645c692165a2a4bf16dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p4bl4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:20Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.541808 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-72msc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4f13f82-a44f-461d-93b0-916a7ed0b925\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06c787dc46c305e7388f67f85601eb3e724c501d4da0504d4f8fd569f533c634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2spqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-72msc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:20Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.616687 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.616716 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.616725 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.616753 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.616762 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:20Z","lastTransitionTime":"2025-12-09T14:59:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.720628 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.720668 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.720676 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.720687 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.720695 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:20Z","lastTransitionTime":"2025-12-09T14:59:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.822654 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.822747 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.822821 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.822904 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.822971 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:20Z","lastTransitionTime":"2025-12-09T14:59:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.925217 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.925570 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.925636 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.925711 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:20 crc kubenswrapper[4555]: I1209 14:59:20.925767 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:20Z","lastTransitionTime":"2025-12-09T14:59:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:21 crc kubenswrapper[4555]: I1209 14:59:21.027850 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:21 crc kubenswrapper[4555]: I1209 14:59:21.027892 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:21 crc kubenswrapper[4555]: I1209 14:59:21.027901 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:21 crc kubenswrapper[4555]: I1209 14:59:21.027914 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:21 crc kubenswrapper[4555]: I1209 14:59:21.027921 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:21Z","lastTransitionTime":"2025-12-09T14:59:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:21 crc kubenswrapper[4555]: I1209 14:59:21.111967 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:21 crc kubenswrapper[4555]: I1209 14:59:21.112000 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:21 crc kubenswrapper[4555]: I1209 14:59:21.112009 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:21 crc kubenswrapper[4555]: I1209 14:59:21.112022 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:21 crc kubenswrapper[4555]: I1209 14:59:21.112031 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:21Z","lastTransitionTime":"2025-12-09T14:59:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:21 crc kubenswrapper[4555]: E1209 14:59:21.120615 4555 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"73387e21-fe11-4735-b3c1-0a3cfb47f8fd\\\",\\\"systemUUID\\\":\\\"bd9f288e-1f46-4e96-a21f-7eb84ab89d9c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:21Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:21 crc kubenswrapper[4555]: I1209 14:59:21.124962 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:21 crc kubenswrapper[4555]: I1209 14:59:21.125350 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 09 14:59:21 crc kubenswrapper[4555]: I1209 14:59:21.125426 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:21 crc kubenswrapper[4555]: I1209 14:59:21.125502 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:21 crc kubenswrapper[4555]: I1209 14:59:21.125558 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:21Z","lastTransitionTime":"2025-12-09T14:59:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:21 crc kubenswrapper[4555]: E1209 14:59:21.133851 4555 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"73387e21-fe11-4735-b3c1-0a3cfb47f8fd\\\",\\\"systemUUID\\\":\\\"bd9f288e-1f46-4e96-a21f-7eb84ab89d9c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:21Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:21 crc kubenswrapper[4555]: I1209 14:59:21.136263 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:21 crc kubenswrapper[4555]: I1209 14:59:21.136293 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 09 14:59:21 crc kubenswrapper[4555]: I1209 14:59:21.136316 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:21 crc kubenswrapper[4555]: I1209 14:59:21.136329 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:21 crc kubenswrapper[4555]: I1209 14:59:21.136337 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:21Z","lastTransitionTime":"2025-12-09T14:59:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:21 crc kubenswrapper[4555]: E1209 14:59:21.144139 4555 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"73387e21-fe11-4735-b3c1-0a3cfb47f8fd\\\",\\\"systemUUID\\\":\\\"bd9f288e-1f46-4e96-a21f-7eb84ab89d9c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:21Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:21 crc kubenswrapper[4555]: I1209 14:59:21.146673 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:21 crc kubenswrapper[4555]: I1209 14:59:21.146700 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 09 14:59:21 crc kubenswrapper[4555]: I1209 14:59:21.146713 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:21 crc kubenswrapper[4555]: I1209 14:59:21.146742 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:21 crc kubenswrapper[4555]: I1209 14:59:21.146751 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:21Z","lastTransitionTime":"2025-12-09T14:59:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:21 crc kubenswrapper[4555]: E1209 14:59:21.154710 4555 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"73387e21-fe11-4735-b3c1-0a3cfb47f8fd\\\",\\\"systemUUID\\\":\\\"bd9f288e-1f46-4e96-a21f-7eb84ab89d9c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:21Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:21 crc kubenswrapper[4555]: I1209 14:59:21.157132 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:21 crc kubenswrapper[4555]: I1209 14:59:21.157161 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 09 14:59:21 crc kubenswrapper[4555]: I1209 14:59:21.157173 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:21 crc kubenswrapper[4555]: I1209 14:59:21.157185 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:21 crc kubenswrapper[4555]: I1209 14:59:21.157193 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:21Z","lastTransitionTime":"2025-12-09T14:59:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:21 crc kubenswrapper[4555]: I1209 14:59:21.160825 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:59:21 crc kubenswrapper[4555]: I1209 14:59:21.160842 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:59:21 crc kubenswrapper[4555]: I1209 14:59:21.160858 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:59:21 crc kubenswrapper[4555]: E1209 14:59:21.160921 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 14:59:21 crc kubenswrapper[4555]: E1209 14:59:21.160974 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 14:59:21 crc kubenswrapper[4555]: E1209 14:59:21.161042 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 14:59:21 crc kubenswrapper[4555]: E1209 14:59:21.164913 4555 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"73387e21-fe11-4735-b3c1-0a3cfb47f8fd\\\",\\\"systemUUID\\\":\\\"bd9f288e-1f46-4e96-a21f-7eb84ab89d9c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:21Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:21 crc kubenswrapper[4555]: E1209 14:59:21.165010 4555 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 09 14:59:21 crc kubenswrapper[4555]: I1209 14:59:21.165840 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 09 14:59:21 crc kubenswrapper[4555]: I1209 14:59:21.165873 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:21 crc kubenswrapper[4555]: I1209 14:59:21.165901 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:21 crc kubenswrapper[4555]: I1209 14:59:21.165910 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:21 crc kubenswrapper[4555]: I1209 14:59:21.165918 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:21Z","lastTransitionTime":"2025-12-09T14:59:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:21 crc kubenswrapper[4555]: I1209 14:59:21.267563 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:21 crc kubenswrapper[4555]: I1209 14:59:21.267582 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:21 crc kubenswrapper[4555]: I1209 14:59:21.267589 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:21 crc kubenswrapper[4555]: I1209 14:59:21.267598 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:21 crc kubenswrapper[4555]: I1209 14:59:21.267606 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:21Z","lastTransitionTime":"2025-12-09T14:59:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:21 crc kubenswrapper[4555]: I1209 14:59:21.368671 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:21 crc kubenswrapper[4555]: I1209 14:59:21.368919 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:21 crc kubenswrapper[4555]: I1209 14:59:21.368980 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:21 crc kubenswrapper[4555]: I1209 14:59:21.369044 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:21 crc kubenswrapper[4555]: I1209 14:59:21.369101 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:21Z","lastTransitionTime":"2025-12-09T14:59:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:21 crc kubenswrapper[4555]: I1209 14:59:21.470888 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:21 crc kubenswrapper[4555]: I1209 14:59:21.470921 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:21 crc kubenswrapper[4555]: I1209 14:59:21.470930 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:21 crc kubenswrapper[4555]: I1209 14:59:21.470943 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:21 crc kubenswrapper[4555]: I1209 14:59:21.470953 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:21Z","lastTransitionTime":"2025-12-09T14:59:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:21 crc kubenswrapper[4555]: I1209 14:59:21.572365 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:21 crc kubenswrapper[4555]: I1209 14:59:21.572395 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:21 crc kubenswrapper[4555]: I1209 14:59:21.572404 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:21 crc kubenswrapper[4555]: I1209 14:59:21.572415 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:21 crc kubenswrapper[4555]: I1209 14:59:21.572422 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:21Z","lastTransitionTime":"2025-12-09T14:59:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:21 crc kubenswrapper[4555]: I1209 14:59:21.673801 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:21 crc kubenswrapper[4555]: I1209 14:59:21.673837 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:21 crc kubenswrapper[4555]: I1209 14:59:21.673848 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:21 crc kubenswrapper[4555]: I1209 14:59:21.673873 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:21 crc kubenswrapper[4555]: I1209 14:59:21.673884 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:21Z","lastTransitionTime":"2025-12-09T14:59:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:21 crc kubenswrapper[4555]: I1209 14:59:21.775421 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:21 crc kubenswrapper[4555]: I1209 14:59:21.775518 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:21 crc kubenswrapper[4555]: I1209 14:59:21.775578 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:21 crc kubenswrapper[4555]: I1209 14:59:21.775654 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:21 crc kubenswrapper[4555]: I1209 14:59:21.775715 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:21Z","lastTransitionTime":"2025-12-09T14:59:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:21 crc kubenswrapper[4555]: I1209 14:59:21.877133 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:21 crc kubenswrapper[4555]: I1209 14:59:21.877163 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:21 crc kubenswrapper[4555]: I1209 14:59:21.877172 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:21 crc kubenswrapper[4555]: I1209 14:59:21.877182 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:21 crc kubenswrapper[4555]: I1209 14:59:21.877191 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:21Z","lastTransitionTime":"2025-12-09T14:59:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:21 crc kubenswrapper[4555]: I1209 14:59:21.978899 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:21 crc kubenswrapper[4555]: I1209 14:59:21.978931 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:21 crc kubenswrapper[4555]: I1209 14:59:21.978939 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:21 crc kubenswrapper[4555]: I1209 14:59:21.978951 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:21 crc kubenswrapper[4555]: I1209 14:59:21.978959 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:21Z","lastTransitionTime":"2025-12-09T14:59:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:22 crc kubenswrapper[4555]: I1209 14:59:22.080735 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:22 crc kubenswrapper[4555]: I1209 14:59:22.080757 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:22 crc kubenswrapper[4555]: I1209 14:59:22.080766 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:22 crc kubenswrapper[4555]: I1209 14:59:22.080776 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:22 crc kubenswrapper[4555]: I1209 14:59:22.080783 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:22Z","lastTransitionTime":"2025-12-09T14:59:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:22 crc kubenswrapper[4555]: I1209 14:59:22.160090 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sc2q" Dec 09 14:59:22 crc kubenswrapper[4555]: E1209 14:59:22.160200 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6sc2q" podUID="35f7e7bd-b667-42fa-b040-4a186e4570e3" Dec 09 14:59:22 crc kubenswrapper[4555]: I1209 14:59:22.167255 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Dec 09 14:59:22 crc kubenswrapper[4555]: I1209 14:59:22.182376 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:22 crc kubenswrapper[4555]: I1209 14:59:22.182419 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:22 crc kubenswrapper[4555]: I1209 14:59:22.182428 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:22 crc kubenswrapper[4555]: I1209 14:59:22.182437 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:22 crc kubenswrapper[4555]: I1209 14:59:22.182447 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:22Z","lastTransitionTime":"2025-12-09T14:59:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:22 crc kubenswrapper[4555]: I1209 14:59:22.283577 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:22 crc kubenswrapper[4555]: I1209 14:59:22.283608 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:22 crc kubenswrapper[4555]: I1209 14:59:22.283616 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:22 crc kubenswrapper[4555]: I1209 14:59:22.283625 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:22 crc kubenswrapper[4555]: I1209 14:59:22.283634 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:22Z","lastTransitionTime":"2025-12-09T14:59:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:22 crc kubenswrapper[4555]: I1209 14:59:22.384806 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:22 crc kubenswrapper[4555]: I1209 14:59:22.384831 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:22 crc kubenswrapper[4555]: I1209 14:59:22.384839 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:22 crc kubenswrapper[4555]: I1209 14:59:22.384865 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:22 crc kubenswrapper[4555]: I1209 14:59:22.384875 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:22Z","lastTransitionTime":"2025-12-09T14:59:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:22 crc kubenswrapper[4555]: I1209 14:59:22.486942 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:22 crc kubenswrapper[4555]: I1209 14:59:22.486973 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:22 crc kubenswrapper[4555]: I1209 14:59:22.486982 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:22 crc kubenswrapper[4555]: I1209 14:59:22.486991 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:22 crc kubenswrapper[4555]: I1209 14:59:22.486999 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:22Z","lastTransitionTime":"2025-12-09T14:59:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:22 crc kubenswrapper[4555]: I1209 14:59:22.588988 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:22 crc kubenswrapper[4555]: I1209 14:59:22.589103 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:22 crc kubenswrapper[4555]: I1209 14:59:22.589175 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:22 crc kubenswrapper[4555]: I1209 14:59:22.589240 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:22 crc kubenswrapper[4555]: I1209 14:59:22.589322 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:22Z","lastTransitionTime":"2025-12-09T14:59:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:22 crc kubenswrapper[4555]: I1209 14:59:22.690722 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:22 crc kubenswrapper[4555]: I1209 14:59:22.690827 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:22 crc kubenswrapper[4555]: I1209 14:59:22.690927 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:22 crc kubenswrapper[4555]: I1209 14:59:22.691011 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:22 crc kubenswrapper[4555]: I1209 14:59:22.691074 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:22Z","lastTransitionTime":"2025-12-09T14:59:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:22 crc kubenswrapper[4555]: I1209 14:59:22.792369 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:22 crc kubenswrapper[4555]: I1209 14:59:22.792467 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:22 crc kubenswrapper[4555]: I1209 14:59:22.792525 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:22 crc kubenswrapper[4555]: I1209 14:59:22.792598 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:22 crc kubenswrapper[4555]: I1209 14:59:22.792665 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:22Z","lastTransitionTime":"2025-12-09T14:59:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:22 crc kubenswrapper[4555]: I1209 14:59:22.893921 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:22 crc kubenswrapper[4555]: I1209 14:59:22.893949 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:22 crc kubenswrapper[4555]: I1209 14:59:22.893957 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:22 crc kubenswrapper[4555]: I1209 14:59:22.893967 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:22 crc kubenswrapper[4555]: I1209 14:59:22.893974 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:22Z","lastTransitionTime":"2025-12-09T14:59:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:22 crc kubenswrapper[4555]: I1209 14:59:22.996093 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:22 crc kubenswrapper[4555]: I1209 14:59:22.996129 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:22 crc kubenswrapper[4555]: I1209 14:59:22.996138 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:22 crc kubenswrapper[4555]: I1209 14:59:22.996150 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:22 crc kubenswrapper[4555]: I1209 14:59:22.996159 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:22Z","lastTransitionTime":"2025-12-09T14:59:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:23 crc kubenswrapper[4555]: I1209 14:59:23.098029 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:23 crc kubenswrapper[4555]: I1209 14:59:23.098101 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:23 crc kubenswrapper[4555]: I1209 14:59:23.098118 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:23 crc kubenswrapper[4555]: I1209 14:59:23.098135 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:23 crc kubenswrapper[4555]: I1209 14:59:23.098147 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:23Z","lastTransitionTime":"2025-12-09T14:59:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:23 crc kubenswrapper[4555]: I1209 14:59:23.160732 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:59:23 crc kubenswrapper[4555]: E1209 14:59:23.160821 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 14:59:23 crc kubenswrapper[4555]: I1209 14:59:23.160830 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:59:23 crc kubenswrapper[4555]: E1209 14:59:23.160920 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 14:59:23 crc kubenswrapper[4555]: I1209 14:59:23.160958 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:59:23 crc kubenswrapper[4555]: E1209 14:59:23.161003 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 14:59:23 crc kubenswrapper[4555]: I1209 14:59:23.161424 4555 scope.go:117] "RemoveContainer" containerID="de4d6b9f459f3e66cc78f2c00c0e4c9a39554ea732d4f2e1d2b85175855ef2bd" Dec 09 14:59:23 crc kubenswrapper[4555]: I1209 14:59:23.200244 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:23 crc kubenswrapper[4555]: I1209 14:59:23.200269 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:23 crc kubenswrapper[4555]: I1209 14:59:23.200277 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:23 crc kubenswrapper[4555]: I1209 14:59:23.200288 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:23 crc kubenswrapper[4555]: I1209 14:59:23.200297 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:23Z","lastTransitionTime":"2025-12-09T14:59:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:23 crc kubenswrapper[4555]: I1209 14:59:23.302335 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:23 crc kubenswrapper[4555]: I1209 14:59:23.302357 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:23 crc kubenswrapper[4555]: I1209 14:59:23.302365 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:23 crc kubenswrapper[4555]: I1209 14:59:23.302377 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:23 crc kubenswrapper[4555]: I1209 14:59:23.302385 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:23Z","lastTransitionTime":"2025-12-09T14:59:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:23 crc kubenswrapper[4555]: I1209 14:59:23.403944 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p4bl4_c87d128c-39da-4880-9efc-7d2e6319a854/ovnkube-controller/2.log" Dec 09 14:59:23 crc kubenswrapper[4555]: I1209 14:59:23.404046 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:23 crc kubenswrapper[4555]: I1209 14:59:23.404061 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:23 crc kubenswrapper[4555]: I1209 14:59:23.404068 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:23 crc kubenswrapper[4555]: I1209 14:59:23.404081 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:23 crc kubenswrapper[4555]: I1209 14:59:23.404089 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:23Z","lastTransitionTime":"2025-12-09T14:59:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:23 crc kubenswrapper[4555]: I1209 14:59:23.405825 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" event={"ID":"c87d128c-39da-4880-9efc-7d2e6319a854","Type":"ContainerStarted","Data":"f2496bb1d7c5b48f9944b2026978276ba9cf676ecd42dff619698a1ab8776269"} Dec 09 14:59:23 crc kubenswrapper[4555]: I1209 14:59:23.406645 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" Dec 09 14:59:23 crc kubenswrapper[4555]: I1209 14:59:23.415584 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3a4e758-0a2b-45c2-9314-85caacad3889\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6504dd2b92e68c220a759270fa99ddc38e403112a4c10b4eaf5d15539769c3fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7803e784d08586ae7bb48c47ef9e647dd0e53d5eb079ac8f4cfd1b74bcd15564\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7803e784d08586ae7bb48c47ef9e647dd0e53d5eb079ac8f4cfd1b74bcd15564\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\"
:\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:23Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:23 crc kubenswrapper[4555]: I1209 14:59:23.425087 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:23Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:23 crc kubenswrapper[4555]: I1209 14:59:23.434530 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd5a66337668da0b71f8033ec52d3d1ed9ac69f468d6e48c1e863204ec6a6349\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2fd50693723f3d30a7f5fb890ed870f55b357017b85cdf7206ffc97d0d3d9b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:23Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:23 crc kubenswrapper[4555]: I1209 14:59:23.444061 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b929ab12-7d60-4e9f-a1a7-3e2d8b086d36\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04e20c1851a9313f071ddc6ae1191e467f3bea284ea6e0defd914dc5c926b7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff6nf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b220eef2e4180fd65d4a5381999b91753bb7322a9ab7101edf783cb800715fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernete
s.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff6nf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x6p6q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:23Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:23 crc kubenswrapper[4555]: I1209 14:59:23.453323 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tpsq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd8f66c1-da34-466f-ba53-4848c96b1287\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ae3cc97b46fa36f9bd6f439124d81700af58cb44671d46ffa2936260f6dda4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2028082f4c35a431ec183abac3ace39ae9d9d292a8b3b879cf35cf03dc606d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2028082f4c35a431ec183abac3ace39ae9d9d292a8b3b879cf35cf03dc606d
21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f6498db60b815cc152db11a9d809f415e6ed632f66cf7eaeecbacb230d59dbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f6498db60b815cc152db11a9d809f415e6ed632f66cf7eaeecbacb230d59dbe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd308a4726360ffaa40fd9895d4684062d998a9dec9c76222a4be5140ece0ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd308a4726360ffaa40fd9895d4684062d998a9dec9c76222a4be5140ece0ced\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e047250a4bced8e37ee26d9ba3065a61294fa9cc6d9491b6b96e35758b431edc\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e047250a4bced8e37ee26d9ba3065a61294fa9cc6d9491b6b96e35758b431edc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d24862fb9296afeb3ec86c0cd5e862021c7e4c8a0f139f9fd11dfd1faba2b392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d24862fb9296afeb3ec86c0cd5e862021c7e4c8a0f139f9fd11dfd1faba2b392\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ac07e538318096b1c9c2c8d0608f0b4d9b4367df0bb24575591b2f3f6c28c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ac07e538318096b1c9c2c8d0608f0b4d9b4367df0bb24575591b2f3f6c28c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tpsq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:23Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:23 crc kubenswrapper[4555]: I1209 14:59:23.462190 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qchk6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae951ddc-0e90-497c-af85-8e39088f32d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59535f0584581ae9d42fa3c1d9f4f204c8a2c7c96d211a2b8d74806eb3901a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwhfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1a5e10f192fc24405b980ce611a09f1e0c6d49c4c94df89bceef151a747f1aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountP
ath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwhfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qchk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:23Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:23 crc kubenswrapper[4555]: I1209 14:59:23.469409 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t6kl9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c97d4dc5-ce47-4a7b-9511-7168db0ee4b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd97a458fe790bf9d7a8587a77d464c8bc4f97919bf4646660d92bd016be2707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47xhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t6kl9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:23Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:23 crc kubenswrapper[4555]: I1209 14:59:23.478228 
4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"031bef68-de1b-47ae-92d9-8f3ab57baecb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd53bcc6affff59de2add405be1496710b4488d3794d358533ddaeecb1c50c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cd3af11c101edd63da0124a2190c0ee60408cf12e241de0b5743aff5732738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://787954aa7f87026339e7b167ba10c60c81f91c4d0f234fc508684f929edda0ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c591
bf964c364ed5dfb0b30ae43f7fd456c549f701b0f1b8f7ab9ca7b8648f1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7803e19de1c3c096dfb29cd40b66a640cf8fe695b24eaf0ecfef38e88feefc27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:23Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:23 crc kubenswrapper[4555]: I1209 14:59:23.486148 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23358419-437c-41bf-bf1f-0da572d7ae77\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5ec008c2fa2ff6b0bffe7a32708ff1d0bfcfc0b6edbdd084977b99a2051bdc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://376e853c04e3ccafef162872380c7bf1dc0301c04c70c5428b965ca18989088f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6099f0072464b66ae672e70c295fe21a13657b81cc71dcecc55314cd91816c75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b8f1c6dd89c91a8ff2ca8baf361ee78811eeb8e10f2c4bb0504644cdd1302d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:23Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:23 crc kubenswrapper[4555]: I1209 14:59:23.494378 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:23Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:23 crc kubenswrapper[4555]: I1209 14:59:23.501285 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6sc2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35f7e7bd-b667-42fa-b040-4a186e4570e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mc8nv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mc8nv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:46Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6sc2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:23Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:23 crc kubenswrapper[4555]: I1209 14:59:23.505755 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:23 crc kubenswrapper[4555]: I1209 14:59:23.505789 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:23 crc kubenswrapper[4555]: I1209 14:59:23.505798 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:23 crc kubenswrapper[4555]: I1209 14:59:23.505811 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:23 crc kubenswrapper[4555]: I1209 14:59:23.505821 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:23Z","lastTransitionTime":"2025-12-09T14:59:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:23 crc kubenswrapper[4555]: I1209 14:59:23.508338 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-72msc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4f13f82-a44f-461d-93b0-916a7ed0b925\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06c787dc46c305e7388f67f85601eb3e724c501d4da0504d4f8fd569f533c634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2spqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-72msc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:23Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:23 crc kubenswrapper[4555]: I1209 14:59:23.521087 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cb89a32-a132-4a5c-88ca-e561d2462abb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8452332d27457e371020c42533ef78285e33851568eed69be71bc9d10e663b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32422f3e8abddcc0d533cce0b4fc55c65309e00a02d4757d415d6f63f9b2e47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9db356805ba4080379f39ddf432276ac77417504341e77c1cc2d704756af1f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550d7f8a2e6cf94de70026192f2660d59f31fa1
675a7c7e432767db5bf58c1b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://630ad7ccda8e51ae4d116b5821e71ebcc3fb68b4614de58470610807ad351ed6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:23Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:23 crc kubenswrapper[4555]: I1209 14:59:23.528551 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffcfa1a3-f24c-4809-8468-636ab9b1a063\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5c1f863d399da019632ceb8d698cf8aa186d0f533ff8a49da7c66dd11086253\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa09f69008b086bb69e768437b40d896803a11f82f6949202ed0a456f3a213be\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b44fa2c57434247fea11c1e1fbed34ad1df532e7e6bba0623634541322627f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a30d185f354ca39197fe30559783400a7d218ef0a162fd5fa5e5039d1946d127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a30d185f354ca39197fe30559783400a7d218ef0a162fd5fa5e5039d1946d127\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:23Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:23 crc kubenswrapper[4555]: I1209 14:59:23.536871 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d81323acabedcf739459ccc35af7dcbf68cd1159c9304c0ef841d774287425cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:23Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:23 crc kubenswrapper[4555]: I1209 14:59:23.544861 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:23Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:23 crc kubenswrapper[4555]: I1209 14:59:23.552635 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://222e3c51a925bbfd932c3620f13ad919006e2dad2c9063c60559ece4dd239f3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:23Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:23 crc kubenswrapper[4555]: I1209 14:59:23.560967 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gjjcm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e34542c-29e8-43d3-add3-42aa080e0c8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73a16b54312a5c81ee3d5b4753cd53c9a01cc7f174b340b1f0db44b019541fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b138ef0764baeeae025ff996cd643edabbc1de8f567e73f41fa748f481929656\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T14:59:18Z\\\",\\\"message\\\":\\\"2025-12-09T14:58:33+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_43b55716-79db-4d3f-93e7-dbc41035fea1\\\\n2025-12-09T14:58:33+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_43b55716-79db-4d3f-93e7-dbc41035fea1 to /host/opt/cni/bin/\\\\n2025-12-09T14:58:33Z [verbose] multus-daemon started\\\\n2025-12-09T14:58:33Z [verbose] Readiness Indicator file check\\\\n2025-12-09T14:59:18Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqc9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gjjcm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:23Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:23 crc kubenswrapper[4555]: I1209 14:59:23.573369 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c87d128c-39da-4880-9efc-7d2e6319a854\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dee71204e7a71c311143a6ed6fbde69968bc877053d6a400d252eaa010a88ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6cd4a3aeb4f2f1ad5ad6b22f80134a393ccb4c5b7e22c33d97324c8f70800d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a05bba24ff7f1d956fa0d3cb5949de61a278cddb4674bdb723749a2b005d82d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec3d3a350dce6c9b5ea637d00abdaf43796a5595105dc82133f3d84349c12c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://932166905ddf0fc44b9b7356a8f5efe68e1d44307d8e486946c8285ae969b8e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64685d2778ef0cf9d316897042a9e9353603a0cc7a234baf821ec03b149aeb07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-s
ocket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2496bb1d7c5b48f9944b2026978276ba9cf676ecd42dff619698a1ab8776269\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de4d6b9f459f3e66cc78f2c00c0e4c9a39554ea732d4f2e1d2b85175855ef2bd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T14:58:57Z\\\",\\\"message\\\":\\\"ice k8s.ovn.org/owner:openshift-machine-config-operator/machine-config-operator]} name:Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.183:9001:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {5b85277d-d9b7-4a68-8e4e-2b80594d9347}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1209 14:58:57.738539 6180 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver/api]} name:Service_openshift-apiserver/api_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.37:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {88e20c31-5b8d-4d44-bbd8-dba87b7dbaf0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1209 14:58:57.738561 6180 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service 
k8s.ovn\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9dc3533604ee31d652a75a8c65146c35b320c41f2d4fe85c3c03cadb99a9d46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\
\"containerID\\\":\\\"cri-o://c9ce6bcdccee7782ed6a2472052064ea5f0f9a5f25a8645c692165a2a4bf16dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9ce6bcdccee7782ed6a2472052064ea5f0f9a5f25a8645c692165a2a4bf16dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p4bl4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:23Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:23 crc kubenswrapper[4555]: I1209 14:59:23.607708 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:23 crc kubenswrapper[4555]: I1209 14:59:23.607740 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:23 crc kubenswrapper[4555]: I1209 14:59:23.607748 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:23 crc kubenswrapper[4555]: I1209 14:59:23.607759 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:23 crc kubenswrapper[4555]: I1209 14:59:23.607768 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:23Z","lastTransitionTime":"2025-12-09T14:59:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:23 crc kubenswrapper[4555]: I1209 14:59:23.709844 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:23 crc kubenswrapper[4555]: I1209 14:59:23.709888 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:23 crc kubenswrapper[4555]: I1209 14:59:23.709896 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:23 crc kubenswrapper[4555]: I1209 14:59:23.709908 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:23 crc kubenswrapper[4555]: I1209 14:59:23.709915 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:23Z","lastTransitionTime":"2025-12-09T14:59:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:23 crc kubenswrapper[4555]: I1209 14:59:23.811754 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:23 crc kubenswrapper[4555]: I1209 14:59:23.811790 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:23 crc kubenswrapper[4555]: I1209 14:59:23.811799 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:23 crc kubenswrapper[4555]: I1209 14:59:23.811812 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:23 crc kubenswrapper[4555]: I1209 14:59:23.811820 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:23Z","lastTransitionTime":"2025-12-09T14:59:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:23 crc kubenswrapper[4555]: I1209 14:59:23.913205 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:23 crc kubenswrapper[4555]: I1209 14:59:23.913245 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:23 crc kubenswrapper[4555]: I1209 14:59:23.913255 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:23 crc kubenswrapper[4555]: I1209 14:59:23.913269 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:23 crc kubenswrapper[4555]: I1209 14:59:23.913280 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:23Z","lastTransitionTime":"2025-12-09T14:59:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:24 crc kubenswrapper[4555]: I1209 14:59:24.014868 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:24 crc kubenswrapper[4555]: I1209 14:59:24.014892 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:24 crc kubenswrapper[4555]: I1209 14:59:24.014900 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:24 crc kubenswrapper[4555]: I1209 14:59:24.014912 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:24 crc kubenswrapper[4555]: I1209 14:59:24.014920 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:24Z","lastTransitionTime":"2025-12-09T14:59:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:24 crc kubenswrapper[4555]: I1209 14:59:24.116951 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:24 crc kubenswrapper[4555]: I1209 14:59:24.116982 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:24 crc kubenswrapper[4555]: I1209 14:59:24.116991 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:24 crc kubenswrapper[4555]: I1209 14:59:24.117000 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:24 crc kubenswrapper[4555]: I1209 14:59:24.117008 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:24Z","lastTransitionTime":"2025-12-09T14:59:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:24 crc kubenswrapper[4555]: I1209 14:59:24.160508 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sc2q" Dec 09 14:59:24 crc kubenswrapper[4555]: E1209 14:59:24.160633 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6sc2q" podUID="35f7e7bd-b667-42fa-b040-4a186e4570e3" Dec 09 14:59:24 crc kubenswrapper[4555]: I1209 14:59:24.218395 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:24 crc kubenswrapper[4555]: I1209 14:59:24.218423 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:24 crc kubenswrapper[4555]: I1209 14:59:24.218432 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:24 crc kubenswrapper[4555]: I1209 14:59:24.218445 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:24 crc kubenswrapper[4555]: I1209 14:59:24.218456 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:24Z","lastTransitionTime":"2025-12-09T14:59:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:24 crc kubenswrapper[4555]: I1209 14:59:24.320034 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:24 crc kubenswrapper[4555]: I1209 14:59:24.320055 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:24 crc kubenswrapper[4555]: I1209 14:59:24.320063 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:24 crc kubenswrapper[4555]: I1209 14:59:24.320073 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:24 crc kubenswrapper[4555]: I1209 14:59:24.320082 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:24Z","lastTransitionTime":"2025-12-09T14:59:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:24 crc kubenswrapper[4555]: I1209 14:59:24.408723 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p4bl4_c87d128c-39da-4880-9efc-7d2e6319a854/ovnkube-controller/3.log" Dec 09 14:59:24 crc kubenswrapper[4555]: I1209 14:59:24.409217 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p4bl4_c87d128c-39da-4880-9efc-7d2e6319a854/ovnkube-controller/2.log" Dec 09 14:59:24 crc kubenswrapper[4555]: I1209 14:59:24.411225 4555 generic.go:334] "Generic (PLEG): container finished" podID="c87d128c-39da-4880-9efc-7d2e6319a854" containerID="f2496bb1d7c5b48f9944b2026978276ba9cf676ecd42dff619698a1ab8776269" exitCode=1 Dec 09 14:59:24 crc kubenswrapper[4555]: I1209 14:59:24.411255 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" event={"ID":"c87d128c-39da-4880-9efc-7d2e6319a854","Type":"ContainerDied","Data":"f2496bb1d7c5b48f9944b2026978276ba9cf676ecd42dff619698a1ab8776269"} Dec 09 14:59:24 crc kubenswrapper[4555]: I1209 14:59:24.411280 4555 scope.go:117] "RemoveContainer" containerID="de4d6b9f459f3e66cc78f2c00c0e4c9a39554ea732d4f2e1d2b85175855ef2bd" Dec 09 14:59:24 crc kubenswrapper[4555]: I1209 14:59:24.411738 4555 scope.go:117] "RemoveContainer" containerID="f2496bb1d7c5b48f9944b2026978276ba9cf676ecd42dff619698a1ab8776269" Dec 09 14:59:24 crc kubenswrapper[4555]: E1209 14:59:24.411865 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-p4bl4_openshift-ovn-kubernetes(c87d128c-39da-4880-9efc-7d2e6319a854)\"" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" podUID="c87d128c-39da-4880-9efc-7d2e6319a854" Dec 09 14:59:24 crc kubenswrapper[4555]: I1209 14:59:24.421457 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:24 crc kubenswrapper[4555]: I1209 14:59:24.421477 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:24 crc kubenswrapper[4555]: I1209 14:59:24.421486 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:24 crc kubenswrapper[4555]: I1209 14:59:24.421497 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:24 crc kubenswrapper[4555]: I1209 14:59:24.421505 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:24Z","lastTransitionTime":"2025-12-09T14:59:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:24 crc kubenswrapper[4555]: I1209 14:59:24.422739 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d81323acabedcf739459ccc35af7dcbf68cd1159c9304c0ef841d774287425cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:24Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:24 crc kubenswrapper[4555]: I1209 14:59:24.430777 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:24Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:24 crc kubenswrapper[4555]: I1209 14:59:24.443064 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://222e3c51a925bbfd932c3620f13ad919006e2dad2c9063c60559ece4dd239f3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:24Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:24 crc kubenswrapper[4555]: I1209 14:59:24.451029 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gjjcm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e34542c-29e8-43d3-add3-42aa080e0c8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73a16b54312a5c81ee3d5b4753cd53c9a01cc7f174b340b1f0db44b019541fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b138ef0764baeeae025ff996cd643edabbc1de8f567e73f41fa748f481929656\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T14:59:18Z\\\",\\\"message\\\":\\\"2025-12-09T14:58:33+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_43b55716-79db-4d3f-93e7-dbc41035fea1\\\\n2025-12-09T14:58:33+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_43b55716-79db-4d3f-93e7-dbc41035fea1 to /host/opt/cni/bin/\\\\n2025-12-09T14:58:33Z [verbose] multus-daemon started\\\\n2025-12-09T14:58:33Z [verbose] Readiness Indicator file check\\\\n2025-12-09T14:59:18Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqc9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gjjcm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:24Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:24 crc kubenswrapper[4555]: I1209 14:59:24.462552 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c87d128c-39da-4880-9efc-7d2e6319a854\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dee71204e7a71c311143a6ed6fbde69968bc877053d6a400d252eaa010a88ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6cd4a3aeb4f2f1ad5ad6b22f80134a393ccb4c5b7e22c33d97324c8f70800d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a05bba24ff7f1d956fa0d3cb5949de61a278cddb4674bdb723749a2b005d82d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec3d3a350dce6c9b5ea637d00abdaf43796a5595105dc82133f3d84349c12c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://932166905ddf0fc44b9b7356a8f5efe68e1d44307d8e486946c8285ae969b8e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64685d2778ef0cf9d316897042a9e9353603a0cc7a234baf821ec03b149aeb07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-s
ocket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2496bb1d7c5b48f9944b2026978276ba9cf676ecd42dff619698a1ab8776269\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de4d6b9f459f3e66cc78f2c00c0e4c9a39554ea732d4f2e1d2b85175855ef2bd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T14:58:57Z\\\",\\\"message\\\":\\\"ice k8s.ovn.org/owner:openshift-machine-config-operator/machine-config-operator]} name:Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.183:9001:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {5b85277d-d9b7-4a68-8e4e-2b80594d9347}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1209 14:58:57.738539 6180 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver/api]} name:Service_openshift-apiserver/api_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.37:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {88e20c31-5b8d-4d44-bbd8-dba87b7dbaf0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1209 14:58:57.738561 6180 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2496bb1d7c5b48f9944b2026978276ba9cf676ecd42dff619698a1ab8776269\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T14:59:23Z\\\",\\\"message\\\":\\\"router-internal-default\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.176\\\\\\\", Port:80, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.176\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.176\\\\\\\", Port:1936, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1209 14:59:23.772112 6583 loadbalancer.go:304] Deleted 0 stale LBs for 
map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-operator\\\\\\\"}\\\\nF1209 14:59:23.772066 6583 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default nod\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9dc3533604ee31d652a75a8c65146c35b320c41f2d4fe85c3c03cadb99a9d46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\
\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ce6bcdccee7782ed6a2472052064ea5f0f9a5f25a8645c692165a2a4bf16dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9ce6bcdccee7782ed6a2472052064ea5f0f9a5f25a8645c692165a2a4bf16dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p4bl4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:24Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:24 crc kubenswrapper[4555]: I1209 14:59:24.468483 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-72msc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4f13f82-a44f-461d-93b0-916a7ed0b925\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06c787dc46c305e7388f67f85601eb3e724c501d4da0504d4f8fd569f533c634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2spqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-72msc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:24Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:24 crc kubenswrapper[4555]: I1209 14:59:24.480784 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cb89a32-a132-4a5c-88ca-e561d2462abb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8452332d27457e371020c42533ef78285e33851568eed69be71bc9d10e663b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32422f3e8abddcc0d533cce0b4fc55c65309e00a02d4757d415d6f63f9b2e47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9db356805ba4080379f39ddf432276ac77417504341e77c1cc2d704756af1f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550d7f8a2e6cf94de70026192f2660d59f31fa1
675a7c7e432767db5bf58c1b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://630ad7ccda8e51ae4d116b5821e71ebcc3fb68b4614de58470610807ad351ed6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:24Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:24 crc kubenswrapper[4555]: I1209 14:59:24.487733 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffcfa1a3-f24c-4809-8468-636ab9b1a063\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5c1f863d399da019632ceb8d698cf8aa186d0f533ff8a49da7c66dd11086253\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa09f69008b086bb69e768437b40d896803a11f82f6949202ed0a456f3a213be\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b44fa2c57434247fea11c1e1fbed34ad1df532e7e6bba0623634541322627f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a30d185f354ca39197fe30559783400a7d218ef0a162fd5fa5e5039d1946d127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a30d185f354ca39197fe30559783400a7d218ef0a162fd5fa5e5039d1946d127\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:24Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:24 crc kubenswrapper[4555]: I1209 14:59:24.495350 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd5a66337668da0b71f8033ec52d3d1ed9ac69f468d6e48c1e863204ec6a6349\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2fd50693723f3d30a7f5fb890ed870f55b357017b85cdf7206ffc97d0d3d9b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:24Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:24 crc kubenswrapper[4555]: I1209 14:59:24.501925 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b929ab12-7d60-4e9f-a1a7-3e2d8b086d36\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04e20c1851a9313f071ddc6ae1191e467f3bea284ea6e0defd914dc5c926b7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff6nf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b220eef2e4180fd65d4a5381999b91753bb7322a9ab7101edf783cb800715fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff6nf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x6p6q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:24Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:24 crc kubenswrapper[4555]: I1209 14:59:24.511159 4555 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tpsq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd8f66c1-da34-466f-ba53-4848c96b1287\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ae3cc97b46fa36f9bd6f439124d81700af58cb44671d46ffa2936260f6dda4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2028082f4c35a431ec183abac3ace39ae9d9d292a8b3b879cf35cf03dc606d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2028082f4c35a431ec183abac3ace39ae9d9d292a8b3b879cf35cf03dc606d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f6498db60b815cc152db11a9d809f415e6ed632f66cf7eaeecbacb230d59dbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2c
c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f6498db60b815cc152db11a9d809f415e6ed632f66cf7eaeecbacb230d59dbe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd308a4726360ffaa40fd9895d4684062d998a9dec9c76222a4be5140ece0ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd308a4726360ffaa40fd9895d4684062d998a9dec9c76222a4be5140ece0ced\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e047250a4bced8e37ee26d9ba3065a61294fa9cc6d9491b6b96e35758b431edc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e047250a4bced8e37ee26d9ba3065a61294fa9cc6d9491b6b96e35758b431edc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d24862fb9296afeb3ec86c0cd5e862021c7e4c8a0f139f9fd11dfd1faba2b392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d24862fb9296afeb3ec86c0cd5e862021c7e4c8a0f139f9fd11dfd1faba2b392\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ac07e538318096b1c9c2c8d0608f0b4d9b4367df0bb24575591b2f3f6c28c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ac07e538318096b1c9c2c8d0608f0b4d9b4367df0bb24575591b2f3f6c28c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tpsq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:24Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:24 crc kubenswrapper[4555]: I1209 14:59:24.517888 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qchk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae951ddc-0e90-497c-af85-8e39088f32d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59535f0584581ae9d42fa3c1d9f4f204c8a2c7c96d211a2b8d74806eb3901a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwhfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1a5e10f192fc24405b980ce611a09f1e0c6d49c4c94df89bceef151a747f1aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwhfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qchk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:24Z is after 2025-08-24T17:21:41Z" Dec 09 
14:59:24 crc kubenswrapper[4555]: I1209 14:59:24.523035 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:24 crc kubenswrapper[4555]: I1209 14:59:24.523064 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:24 crc kubenswrapper[4555]: I1209 14:59:24.523073 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:24 crc kubenswrapper[4555]: I1209 14:59:24.523086 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:24 crc kubenswrapper[4555]: I1209 14:59:24.523095 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:24Z","lastTransitionTime":"2025-12-09T14:59:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:24 crc kubenswrapper[4555]: I1209 14:59:24.524459 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3a4e758-0a2b-45c2-9314-85caacad3889\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6504dd2b92e68c220a759270fa99ddc38e403112a4c10b4eaf5d15539769c3fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7803e784d08586ae7bb48c47ef9e647dd0e53d5eb079ac8f4cfd1b74bcd15564\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08a
af09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7803e784d08586ae7bb48c47ef9e647dd0e53d5eb079ac8f4cfd1b74bcd15564\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:24Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:24 crc kubenswrapper[4555]: I1209 14:59:24.532456 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:24Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:24 crc kubenswrapper[4555]: I1209 14:59:24.538785 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t6kl9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c97d4dc5-ce47-4a7b-9511-7168db0ee4b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd97a458fe790bf9d7a8587a77d464c8bc4f97919bf4646660d92bd016be2707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47xhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t6kl9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-09T14:59:24Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:24 crc kubenswrapper[4555]: I1209 14:59:24.546988 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:24Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:24 crc kubenswrapper[4555]: I1209 14:59:24.553962 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6sc2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35f7e7bd-b667-42fa-b040-4a186e4570e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mc8nv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mc8nv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:46Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6sc2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:24Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:24 crc kubenswrapper[4555]: I1209 14:59:24.562611 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"031bef68-de1b-47ae-92d9-8f3ab57baecb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd53bcc6affff59de2add405be1496710b4488d3794d358533ddaeecb1c50c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cd3af11c101edd63da0124a2190c0ee60408cf12e241de0b5743aff5732738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://787954aa7f87026339e7b167ba10c60c81f91c4d0f234fc508684f929edda0ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c591bf964c364ed5dfb0b30ae43f7fd456c549f701b0f1b8f7ab9ca7b8648f1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7803e19de1c3c096dfb29cd40b66a640cf8fe695b24eaf0ecfef38e88feefc27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:24Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:24 crc kubenswrapper[4555]: I1209 14:59:24.570152 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23358419-437c-41bf-bf1f-0da572d7ae77\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5ec008c2fa2ff6b0bffe7a32708ff1d0bfcfc0b6edbdd084977b99a2051bdc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://376e853c04e3ccafef162872380c7bf1dc0301c04c70c5428b965ca18989088f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6099f0072464b66ae672e70c295fe21a13657b81cc71dcecc55314cd91816c75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b8f1c6dd89c91a8ff2ca8baf361ee78811eeb8e10f2c4bb0504644cdd1302d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:24Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:24 crc kubenswrapper[4555]: I1209 14:59:24.624606 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:24 crc kubenswrapper[4555]: I1209 14:59:24.624710 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:24 crc kubenswrapper[4555]: I1209 14:59:24.624791 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:24 crc kubenswrapper[4555]: I1209 14:59:24.624858 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:24 crc kubenswrapper[4555]: I1209 14:59:24.624975 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:24Z","lastTransitionTime":"2025-12-09T14:59:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:24 crc kubenswrapper[4555]: I1209 14:59:24.727190 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:24 crc kubenswrapper[4555]: I1209 14:59:24.727212 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:24 crc kubenswrapper[4555]: I1209 14:59:24.727221 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:24 crc kubenswrapper[4555]: I1209 14:59:24.727230 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:24 crc kubenswrapper[4555]: I1209 14:59:24.727238 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:24Z","lastTransitionTime":"2025-12-09T14:59:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:24 crc kubenswrapper[4555]: I1209 14:59:24.828611 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:24 crc kubenswrapper[4555]: I1209 14:59:24.828632 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:24 crc kubenswrapper[4555]: I1209 14:59:24.828640 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:24 crc kubenswrapper[4555]: I1209 14:59:24.828649 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:24 crc kubenswrapper[4555]: I1209 14:59:24.828656 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:24Z","lastTransitionTime":"2025-12-09T14:59:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:24 crc kubenswrapper[4555]: I1209 14:59:24.930535 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:24 crc kubenswrapper[4555]: I1209 14:59:24.930563 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:24 crc kubenswrapper[4555]: I1209 14:59:24.930571 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:24 crc kubenswrapper[4555]: I1209 14:59:24.930585 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:24 crc kubenswrapper[4555]: I1209 14:59:24.930594 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:24Z","lastTransitionTime":"2025-12-09T14:59:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:25 crc kubenswrapper[4555]: I1209 14:59:25.032662 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:25 crc kubenswrapper[4555]: I1209 14:59:25.032684 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:25 crc kubenswrapper[4555]: I1209 14:59:25.032692 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:25 crc kubenswrapper[4555]: I1209 14:59:25.032702 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:25 crc kubenswrapper[4555]: I1209 14:59:25.032711 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:25Z","lastTransitionTime":"2025-12-09T14:59:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:25 crc kubenswrapper[4555]: I1209 14:59:25.134033 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:25 crc kubenswrapper[4555]: I1209 14:59:25.134063 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:25 crc kubenswrapper[4555]: I1209 14:59:25.134071 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:25 crc kubenswrapper[4555]: I1209 14:59:25.134082 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:25 crc kubenswrapper[4555]: I1209 14:59:25.134089 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:25Z","lastTransitionTime":"2025-12-09T14:59:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:25 crc kubenswrapper[4555]: I1209 14:59:25.160092 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:59:25 crc kubenswrapper[4555]: I1209 14:59:25.160113 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:59:25 crc kubenswrapper[4555]: I1209 14:59:25.160140 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:59:25 crc kubenswrapper[4555]: E1209 14:59:25.160185 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 14:59:25 crc kubenswrapper[4555]: E1209 14:59:25.160253 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 14:59:25 crc kubenswrapper[4555]: E1209 14:59:25.160338 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 14:59:25 crc kubenswrapper[4555]: I1209 14:59:25.236059 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:25 crc kubenswrapper[4555]: I1209 14:59:25.236084 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:25 crc kubenswrapper[4555]: I1209 14:59:25.236094 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:25 crc kubenswrapper[4555]: I1209 14:59:25.236104 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:25 crc kubenswrapper[4555]: I1209 14:59:25.236113 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:25Z","lastTransitionTime":"2025-12-09T14:59:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:25 crc kubenswrapper[4555]: I1209 14:59:25.337544 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:25 crc kubenswrapper[4555]: I1209 14:59:25.337655 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:25 crc kubenswrapper[4555]: I1209 14:59:25.337734 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:25 crc kubenswrapper[4555]: I1209 14:59:25.337806 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:25 crc kubenswrapper[4555]: I1209 14:59:25.337873 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:25Z","lastTransitionTime":"2025-12-09T14:59:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:25 crc kubenswrapper[4555]: I1209 14:59:25.414857 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p4bl4_c87d128c-39da-4880-9efc-7d2e6319a854/ovnkube-controller/3.log" Dec 09 14:59:25 crc kubenswrapper[4555]: I1209 14:59:25.419925 4555 scope.go:117] "RemoveContainer" containerID="f2496bb1d7c5b48f9944b2026978276ba9cf676ecd42dff619698a1ab8776269" Dec 09 14:59:25 crc kubenswrapper[4555]: E1209 14:59:25.420109 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-p4bl4_openshift-ovn-kubernetes(c87d128c-39da-4880-9efc-7d2e6319a854)\"" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" podUID="c87d128c-39da-4880-9efc-7d2e6319a854" Dec 09 14:59:25 crc kubenswrapper[4555]: I1209 14:59:25.429159 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gjjcm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e34542c-29e8-43d3-add3-42aa080e0c8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73a16b54312a5c81ee3d5b4753cd53c9a01cc7f174b340b1f0db44b019541fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b138ef0764baeeae025ff996cd643edabbc1de8f567e73f41fa748f481929656\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T14:59:18Z\\\",\\\"message\\\":\\\"2025-12-09T14:58:33+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_43b55716-79db-4d3f-93e7-dbc41035fea1\\\\n2025-12-09T14:58:33+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_43b55716-79db-4d3f-93e7-dbc41035fea1 to /host/opt/cni/bin/\\\\n2025-12-09T14:58:33Z [verbose] multus-daemon started\\\\n2025-12-09T14:58:33Z [verbose] Readiness Indicator file check\\\\n2025-12-09T14:59:18Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqc9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gjjcm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:25Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:25 crc kubenswrapper[4555]: I1209 14:59:25.438919 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:25 crc kubenswrapper[4555]: I1209 14:59:25.438946 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:25 crc kubenswrapper[4555]: I1209 14:59:25.438954 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:25 crc kubenswrapper[4555]: I1209 14:59:25.438966 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:25 crc kubenswrapper[4555]: I1209 14:59:25.438974 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:25Z","lastTransitionTime":"2025-12-09T14:59:25Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:25 crc kubenswrapper[4555]: I1209 14:59:25.442162 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c87d128c-39da-4880-9efc-7d2e6319a854\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dee71204e7a71c311143a6ed6fbde69968bc877053d6a400d252eaa010a88ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6cd4a3aeb4f2f1ad5ad6b22f80134a393ccb4c5b7e22c33d97324c8f70800d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/s
ecrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a05bba24ff7f1d956fa0d3cb5949de61a278cddb4674bdb723749a2b005d82d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec3d3a350dce6c9b5ea637d00abdaf43796a5595105dc82133f3d84349c12c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://932166905ddf0fc44b9b7356a8f5efe68e1d44307d8e486946c8285ae969b8e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64685d2778ef0cf9d316897042a9e9353603a0cc7a234baf821ec03b149aeb07\\
\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2496bb1d7c5b48f9944b2026978276ba9cf676ecd42dff619698a1ab8776269\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2496bb1d7c5b48f9944b2026978276ba9cf676ecd42dff619698a1ab8776269\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T14:59:23Z\\\",\\\"message\\\":\\\"router-internal-default\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.176\\\\\\\", Port:80, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.176\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.176\\\\\\\", Port:1936, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1209 14:59:23.772112 6583 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-operator\\\\\\\"}\\\\nF1209 14:59:23.772066 6583 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default 
nod\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:23Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-p4bl4_openshift-ovn-kubernetes(c87d128c-39da-4880-9efc-7d2e6319a854)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9dc3533604ee31d652a75a8c65146c35b320c41f2d4fe85c3c03cadb99a9d46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ce6bcdccee7782ed6a2472052064ea5f0f9a5f25a8645c692165a2a4bf16dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9ce6bcdccee7782ed6a2472052064ea5f0f9a5f25a8645c692165a2a4bf16dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p4bl4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:25Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:25 crc kubenswrapper[4555]: I1209 14:59:25.448415 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-72msc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4f13f82-a44f-461d-93b0-916a7ed0b925\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06c787dc46c305e7388f67f85601eb3e724c501d4da0504d4f8fd569f533c634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2spqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-72msc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:25Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:25 crc kubenswrapper[4555]: I1209 14:59:25.460896 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cb89a32-a132-4a5c-88ca-e561d2462abb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8452332d27457e371020c42533ef78285e33851568eed69be71bc9d10e663b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32422f3e8abddcc0d533cce0b4fc55c65309e00a02d4757d415d6f63f9b2e47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9db356805ba4080379f39ddf432276ac77417504341e77c1cc2d704756af1f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550d7f8a2e6cf94de70026192f2660d59f31fa1
675a7c7e432767db5bf58c1b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://630ad7ccda8e51ae4d116b5821e71ebcc3fb68b4614de58470610807ad351ed6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:25Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:25 crc kubenswrapper[4555]: I1209 14:59:25.468371 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffcfa1a3-f24c-4809-8468-636ab9b1a063\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5c1f863d399da019632ceb8d698cf8aa186d0f533ff8a49da7c66dd11086253\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa09f69008b086bb69e768437b40d896803a11f82f6949202ed0a456f3a213be\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b44fa2c57434247fea11c1e1fbed34ad1df532e7e6bba0623634541322627f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a30d185f354ca39197fe30559783400a7d218ef0a162fd5fa5e5039d1946d127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a30d185f354ca39197fe30559783400a7d218ef0a162fd5fa5e5039d1946d127\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:25Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:25 crc kubenswrapper[4555]: I1209 14:59:25.476221 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d81323acabedcf739459ccc35af7dcbf68cd1159c9304c0ef841d774287425cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:25Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:25 crc kubenswrapper[4555]: I1209 14:59:25.483582 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:25Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:25 crc kubenswrapper[4555]: I1209 14:59:25.490587 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://222e3c51a925bbfd932c3620f13ad919006e2dad2c9063c60559ece4dd239f3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:25Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:25 crc kubenswrapper[4555]: I1209 14:59:25.497692 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qchk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae951ddc-0e90-497c-af85-8e39088f32d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59535f0584581ae9d42fa3c1d9f4f204c8a2c7c96d211a2b8d74806eb3901a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwhfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1a5e10f192fc24405b980ce611a09f1e0c6d49c4c94df89bceef151a747f1aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwhfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qchk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:25Z is after 2025-08-24T17:21:41Z" Dec 09 
14:59:25 crc kubenswrapper[4555]: I1209 14:59:25.504582 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3a4e758-0a2b-45c2-9314-85caacad3889\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6504dd2b92e68c220a759270fa99ddc38e403112a4c10b4eaf5d15539769c3fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7803e784d08586ae7bb48c47ef9e647dd0e53d5eb079ac8f4cfd1b74bcd15564\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7803e784d08586ae7bb48c47ef9e647dd0e53d5eb079ac8f4cfd1b74bcd15564\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:25Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:25 crc kubenswrapper[4555]: I1209 14:59:25.512158 4555 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:25Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:25 crc kubenswrapper[4555]: I1209 14:59:25.520807 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd5a66337668da0b71f8033ec52d3d1ed9ac69f468d6e48c1e863204ec6a6349\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2fd50693723f3d30a7f5fb890ed870f55b357017b85cdf7206ffc97d0d3d9b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:25Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:25 crc kubenswrapper[4555]: I1209 14:59:25.527928 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b929ab12-7d60-4e9f-a1a7-3e2d8b086d36\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04e20c1851a9313f071ddc6ae1191e467f3bea284ea6e0defd914dc5c926b7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff6nf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b220eef2e4180fd65d4a5381999b91753bb7322a9ab7101edf783cb800715fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff6nf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x6p6q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:25Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:25 crc kubenswrapper[4555]: I1209 14:59:25.537821 4555 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tpsq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd8f66c1-da34-466f-ba53-4848c96b1287\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ae3cc97b46fa36f9bd6f439124d81700af58cb44671d46ffa2936260f6dda4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2028082f4c35a431ec183abac3ace39ae9d9d292a8b3b879cf35cf03dc606d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2028082f4c35a431ec183abac3ace39ae9d9d292a8b3b879cf35cf03dc606d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f6498db60b815cc152db11a9d809f415e6ed632f66cf7eaeecbacb230d59dbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2c
c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f6498db60b815cc152db11a9d809f415e6ed632f66cf7eaeecbacb230d59dbe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd308a4726360ffaa40fd9895d4684062d998a9dec9c76222a4be5140ece0ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd308a4726360ffaa40fd9895d4684062d998a9dec9c76222a4be5140ece0ced\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e047250a4bced8e37ee26d9ba3065a61294fa9cc6d9491b6b96e35758b431edc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e047250a4bced8e37ee26d9ba3065a61294fa9cc6d9491b6b96e35758b431edc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d24862fb9296afeb3ec86c0cd5e862021c7e4c8a0f139f9fd11dfd1faba2b392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d24862fb9296afeb3ec86c0cd5e862021c7e4c8a0f139f9fd11dfd1faba2b392\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ac07e538318096b1c9c2c8d0608f0b4d9b4367df0bb24575591b2f3f6c28c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ac07e538318096b1c9c2c8d0608f0b4d9b4367df0bb24575591b2f3f6c28c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tpsq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:25Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:25 crc kubenswrapper[4555]: I1209 14:59:25.540800 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:25 crc kubenswrapper[4555]: I1209 14:59:25.540829 4555 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:25 crc kubenswrapper[4555]: I1209 14:59:25.540849 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:25 crc kubenswrapper[4555]: I1209 14:59:25.540861 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:25 crc kubenswrapper[4555]: I1209 14:59:25.540869 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:25Z","lastTransitionTime":"2025-12-09T14:59:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:25 crc kubenswrapper[4555]: I1209 14:59:25.545070 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t6kl9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c97d4dc5-ce47-4a7b-9511-7168db0ee4b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd97a458fe790bf9d7a8587a77d464c8bc4f97919bf4646660d92bd016be2707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47xhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t6kl9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:25Z is after 
2025-08-24T17:21:41Z" Dec 09 14:59:25 crc kubenswrapper[4555]: I1209 14:59:25.553528 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"031bef68-de1b-47ae-92d9-8f3ab57baecb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd53bcc6affff59de2add405be1496710b4488d3794d358533ddaeecb1c50c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cd3af11c101edd63da0124a2190c0ee60408cf12e241de0b5743aff5732738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://787954aa7f87026339e7b167ba10c60c81f91c4d0f234fc508684f929edda0ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static
-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c591bf964c364ed5dfb0b30ae43f7fd456c549f701b0f1b8f7ab9ca7b8648f1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7803e19de1c3c096dfb29cd40b66a640cf8fe695b24eaf0ecfef38e88feefc27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:25Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:25 crc kubenswrapper[4555]: I1209 14:59:25.561679 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23358419-437c-41bf-bf1f-0da572d7ae77\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5ec008c2fa2ff6b0bffe7a32708ff1d0bfcfc0b6edbdd084977b99a2051bdc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://376e853c04e3ccafef162872380c7bf1dc0301c04c70c5428b965ca18989088f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6099f0072464b66ae672e70c295fe21a13657b81cc71dcecc55314cd91816c75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b8f1c6dd89c91a8ff2ca8baf361ee78811eeb8e10f2c4bb0504644cdd1302d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:25Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:25 crc kubenswrapper[4555]: I1209 14:59:25.569353 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:25Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:25 crc kubenswrapper[4555]: I1209 14:59:25.575693 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6sc2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35f7e7bd-b667-42fa-b040-4a186e4570e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mc8nv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mc8nv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:46Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6sc2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:25Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:25 crc kubenswrapper[4555]: I1209 14:59:25.642685 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:25 crc kubenswrapper[4555]: I1209 14:59:25.642738 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:25 crc kubenswrapper[4555]: I1209 14:59:25.642747 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:25 crc kubenswrapper[4555]: I1209 14:59:25.642759 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:25 crc kubenswrapper[4555]: I1209 14:59:25.642769 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:25Z","lastTransitionTime":"2025-12-09T14:59:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:25 crc kubenswrapper[4555]: I1209 14:59:25.744603 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:25 crc kubenswrapper[4555]: I1209 14:59:25.744849 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:25 crc kubenswrapper[4555]: I1209 14:59:25.744938 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:25 crc kubenswrapper[4555]: I1209 14:59:25.745054 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:25 crc kubenswrapper[4555]: I1209 14:59:25.745218 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:25Z","lastTransitionTime":"2025-12-09T14:59:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:25 crc kubenswrapper[4555]: I1209 14:59:25.847159 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:25 crc kubenswrapper[4555]: I1209 14:59:25.847264 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:25 crc kubenswrapper[4555]: I1209 14:59:25.847346 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:25 crc kubenswrapper[4555]: I1209 14:59:25.847409 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:25 crc kubenswrapper[4555]: I1209 14:59:25.847460 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:25Z","lastTransitionTime":"2025-12-09T14:59:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:25 crc kubenswrapper[4555]: I1209 14:59:25.949659 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:25 crc kubenswrapper[4555]: I1209 14:59:25.949696 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:25 crc kubenswrapper[4555]: I1209 14:59:25.949706 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:25 crc kubenswrapper[4555]: I1209 14:59:25.949717 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:25 crc kubenswrapper[4555]: I1209 14:59:25.949726 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:25Z","lastTransitionTime":"2025-12-09T14:59:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:26 crc kubenswrapper[4555]: I1209 14:59:26.051158 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:26 crc kubenswrapper[4555]: I1209 14:59:26.051187 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:26 crc kubenswrapper[4555]: I1209 14:59:26.051199 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:26 crc kubenswrapper[4555]: I1209 14:59:26.051213 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:26 crc kubenswrapper[4555]: I1209 14:59:26.051224 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:26Z","lastTransitionTime":"2025-12-09T14:59:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:26 crc kubenswrapper[4555]: I1209 14:59:26.152889 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:26 crc kubenswrapper[4555]: I1209 14:59:26.152926 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:26 crc kubenswrapper[4555]: I1209 14:59:26.152936 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:26 crc kubenswrapper[4555]: I1209 14:59:26.152948 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:26 crc kubenswrapper[4555]: I1209 14:59:26.152956 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:26Z","lastTransitionTime":"2025-12-09T14:59:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:26 crc kubenswrapper[4555]: I1209 14:59:26.160144 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sc2q" Dec 09 14:59:26 crc kubenswrapper[4555]: E1209 14:59:26.160243 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6sc2q" podUID="35f7e7bd-b667-42fa-b040-4a186e4570e3" Dec 09 14:59:26 crc kubenswrapper[4555]: I1209 14:59:26.254648 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:26 crc kubenswrapper[4555]: I1209 14:59:26.254673 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:26 crc kubenswrapper[4555]: I1209 14:59:26.254683 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:26 crc kubenswrapper[4555]: I1209 14:59:26.254692 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:26 crc kubenswrapper[4555]: I1209 14:59:26.254700 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:26Z","lastTransitionTime":"2025-12-09T14:59:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:26 crc kubenswrapper[4555]: I1209 14:59:26.356414 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:26 crc kubenswrapper[4555]: I1209 14:59:26.356447 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:26 crc kubenswrapper[4555]: I1209 14:59:26.356456 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:26 crc kubenswrapper[4555]: I1209 14:59:26.356471 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:26 crc kubenswrapper[4555]: I1209 14:59:26.356479 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:26Z","lastTransitionTime":"2025-12-09T14:59:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:26 crc kubenswrapper[4555]: I1209 14:59:26.458410 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:26 crc kubenswrapper[4555]: I1209 14:59:26.458607 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:26 crc kubenswrapper[4555]: I1209 14:59:26.458689 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:26 crc kubenswrapper[4555]: I1209 14:59:26.458758 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:26 crc kubenswrapper[4555]: I1209 14:59:26.458819 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:26Z","lastTransitionTime":"2025-12-09T14:59:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:26 crc kubenswrapper[4555]: I1209 14:59:26.560958 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:26 crc kubenswrapper[4555]: I1209 14:59:26.560992 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:26 crc kubenswrapper[4555]: I1209 14:59:26.561003 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:26 crc kubenswrapper[4555]: I1209 14:59:26.561017 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:26 crc kubenswrapper[4555]: I1209 14:59:26.561028 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:26Z","lastTransitionTime":"2025-12-09T14:59:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:26 crc kubenswrapper[4555]: I1209 14:59:26.662693 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:26 crc kubenswrapper[4555]: I1209 14:59:26.662723 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:26 crc kubenswrapper[4555]: I1209 14:59:26.662731 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:26 crc kubenswrapper[4555]: I1209 14:59:26.662743 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:26 crc kubenswrapper[4555]: I1209 14:59:26.662751 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:26Z","lastTransitionTime":"2025-12-09T14:59:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:26 crc kubenswrapper[4555]: I1209 14:59:26.765036 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:26 crc kubenswrapper[4555]: I1209 14:59:26.765073 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:26 crc kubenswrapper[4555]: I1209 14:59:26.765096 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:26 crc kubenswrapper[4555]: I1209 14:59:26.765110 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:26 crc kubenswrapper[4555]: I1209 14:59:26.765117 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:26Z","lastTransitionTime":"2025-12-09T14:59:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:26 crc kubenswrapper[4555]: I1209 14:59:26.867057 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:26 crc kubenswrapper[4555]: I1209 14:59:26.867112 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:26 crc kubenswrapper[4555]: I1209 14:59:26.867124 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:26 crc kubenswrapper[4555]: I1209 14:59:26.867141 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:26 crc kubenswrapper[4555]: I1209 14:59:26.867157 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:26Z","lastTransitionTime":"2025-12-09T14:59:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:26 crc kubenswrapper[4555]: I1209 14:59:26.969003 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:26 crc kubenswrapper[4555]: I1209 14:59:26.969037 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:26 crc kubenswrapper[4555]: I1209 14:59:26.969045 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:26 crc kubenswrapper[4555]: I1209 14:59:26.969058 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:26 crc kubenswrapper[4555]: I1209 14:59:26.969067 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:26Z","lastTransitionTime":"2025-12-09T14:59:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:27 crc kubenswrapper[4555]: I1209 14:59:27.070373 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:27 crc kubenswrapper[4555]: I1209 14:59:27.070414 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:27 crc kubenswrapper[4555]: I1209 14:59:27.070422 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:27 crc kubenswrapper[4555]: I1209 14:59:27.070432 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:27 crc kubenswrapper[4555]: I1209 14:59:27.070440 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:27Z","lastTransitionTime":"2025-12-09T14:59:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:27 crc kubenswrapper[4555]: I1209 14:59:27.160488 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:59:27 crc kubenswrapper[4555]: I1209 14:59:27.160555 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:59:27 crc kubenswrapper[4555]: E1209 14:59:27.160635 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 14:59:27 crc kubenswrapper[4555]: E1209 14:59:27.160768 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 14:59:27 crc kubenswrapper[4555]: I1209 14:59:27.160779 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:59:27 crc kubenswrapper[4555]: E1209 14:59:27.160998 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 14:59:27 crc kubenswrapper[4555]: I1209 14:59:27.171861 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:27 crc kubenswrapper[4555]: I1209 14:59:27.171910 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:27 crc kubenswrapper[4555]: I1209 14:59:27.171920 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:27 crc kubenswrapper[4555]: I1209 14:59:27.171929 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:27 crc kubenswrapper[4555]: I1209 14:59:27.171936 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:27Z","lastTransitionTime":"2025-12-09T14:59:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:27 crc kubenswrapper[4555]: I1209 14:59:27.273265 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:27 crc kubenswrapper[4555]: I1209 14:59:27.273293 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:27 crc kubenswrapper[4555]: I1209 14:59:27.273316 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:27 crc kubenswrapper[4555]: I1209 14:59:27.273326 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:27 crc kubenswrapper[4555]: I1209 14:59:27.273333 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:27Z","lastTransitionTime":"2025-12-09T14:59:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:27 crc kubenswrapper[4555]: I1209 14:59:27.374875 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:27 crc kubenswrapper[4555]: I1209 14:59:27.374902 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:27 crc kubenswrapper[4555]: I1209 14:59:27.374910 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:27 crc kubenswrapper[4555]: I1209 14:59:27.374919 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:27 crc kubenswrapper[4555]: I1209 14:59:27.374926 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:27Z","lastTransitionTime":"2025-12-09T14:59:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:27 crc kubenswrapper[4555]: I1209 14:59:27.476147 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:27 crc kubenswrapper[4555]: I1209 14:59:27.476178 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:27 crc kubenswrapper[4555]: I1209 14:59:27.476187 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:27 crc kubenswrapper[4555]: I1209 14:59:27.476197 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:27 crc kubenswrapper[4555]: I1209 14:59:27.476205 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:27Z","lastTransitionTime":"2025-12-09T14:59:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:27 crc kubenswrapper[4555]: I1209 14:59:27.577251 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:27 crc kubenswrapper[4555]: I1209 14:59:27.577288 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:27 crc kubenswrapper[4555]: I1209 14:59:27.577295 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:27 crc kubenswrapper[4555]: I1209 14:59:27.577328 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:27 crc kubenswrapper[4555]: I1209 14:59:27.577344 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:27Z","lastTransitionTime":"2025-12-09T14:59:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:27 crc kubenswrapper[4555]: I1209 14:59:27.679131 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:27 crc kubenswrapper[4555]: I1209 14:59:27.679160 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:27 crc kubenswrapper[4555]: I1209 14:59:27.679168 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:27 crc kubenswrapper[4555]: I1209 14:59:27.679176 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:27 crc kubenswrapper[4555]: I1209 14:59:27.679185 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:27Z","lastTransitionTime":"2025-12-09T14:59:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:27 crc kubenswrapper[4555]: I1209 14:59:27.780509 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:27 crc kubenswrapper[4555]: I1209 14:59:27.780540 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:27 crc kubenswrapper[4555]: I1209 14:59:27.780548 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:27 crc kubenswrapper[4555]: I1209 14:59:27.780558 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:27 crc kubenswrapper[4555]: I1209 14:59:27.780567 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:27Z","lastTransitionTime":"2025-12-09T14:59:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:27 crc kubenswrapper[4555]: I1209 14:59:27.882450 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:27 crc kubenswrapper[4555]: I1209 14:59:27.882475 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:27 crc kubenswrapper[4555]: I1209 14:59:27.882484 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:27 crc kubenswrapper[4555]: I1209 14:59:27.882494 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:27 crc kubenswrapper[4555]: I1209 14:59:27.882505 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:27Z","lastTransitionTime":"2025-12-09T14:59:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:27 crc kubenswrapper[4555]: I1209 14:59:27.983777 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:27 crc kubenswrapper[4555]: I1209 14:59:27.983814 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:27 crc kubenswrapper[4555]: I1209 14:59:27.983834 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:27 crc kubenswrapper[4555]: I1209 14:59:27.983848 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:27 crc kubenswrapper[4555]: I1209 14:59:27.983857 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:27Z","lastTransitionTime":"2025-12-09T14:59:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:28 crc kubenswrapper[4555]: I1209 14:59:28.085921 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:28 crc kubenswrapper[4555]: I1209 14:59:28.085972 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:28 crc kubenswrapper[4555]: I1209 14:59:28.085982 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:28 crc kubenswrapper[4555]: I1209 14:59:28.085995 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:28 crc kubenswrapper[4555]: I1209 14:59:28.086004 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:28Z","lastTransitionTime":"2025-12-09T14:59:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:28 crc kubenswrapper[4555]: I1209 14:59:28.160810 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sc2q" Dec 09 14:59:28 crc kubenswrapper[4555]: E1209 14:59:28.161106 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6sc2q" podUID="35f7e7bd-b667-42fa-b040-4a186e4570e3" Dec 09 14:59:28 crc kubenswrapper[4555]: I1209 14:59:28.187665 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:28 crc kubenswrapper[4555]: I1209 14:59:28.187693 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:28 crc kubenswrapper[4555]: I1209 14:59:28.187703 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:28 crc kubenswrapper[4555]: I1209 14:59:28.187712 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:28 crc kubenswrapper[4555]: I1209 14:59:28.187719 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:28Z","lastTransitionTime":"2025-12-09T14:59:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:28 crc kubenswrapper[4555]: I1209 14:59:28.289548 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:28 crc kubenswrapper[4555]: I1209 14:59:28.289580 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:28 crc kubenswrapper[4555]: I1209 14:59:28.289589 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:28 crc kubenswrapper[4555]: I1209 14:59:28.289598 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:28 crc kubenswrapper[4555]: I1209 14:59:28.289624 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:28Z","lastTransitionTime":"2025-12-09T14:59:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:28 crc kubenswrapper[4555]: I1209 14:59:28.391138 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:28 crc kubenswrapper[4555]: I1209 14:59:28.391163 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:28 crc kubenswrapper[4555]: I1209 14:59:28.391171 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:28 crc kubenswrapper[4555]: I1209 14:59:28.391181 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:28 crc kubenswrapper[4555]: I1209 14:59:28.391189 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:28Z","lastTransitionTime":"2025-12-09T14:59:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:28 crc kubenswrapper[4555]: I1209 14:59:28.492785 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:28 crc kubenswrapper[4555]: I1209 14:59:28.492854 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:28 crc kubenswrapper[4555]: I1209 14:59:28.492867 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:28 crc kubenswrapper[4555]: I1209 14:59:28.492883 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:28 crc kubenswrapper[4555]: I1209 14:59:28.492895 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:28Z","lastTransitionTime":"2025-12-09T14:59:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:28 crc kubenswrapper[4555]: I1209 14:59:28.594587 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:28 crc kubenswrapper[4555]: I1209 14:59:28.594617 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:28 crc kubenswrapper[4555]: I1209 14:59:28.594625 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:28 crc kubenswrapper[4555]: I1209 14:59:28.594653 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:28 crc kubenswrapper[4555]: I1209 14:59:28.594663 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:28Z","lastTransitionTime":"2025-12-09T14:59:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:28 crc kubenswrapper[4555]: I1209 14:59:28.696706 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:28 crc kubenswrapper[4555]: I1209 14:59:28.696738 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:28 crc kubenswrapper[4555]: I1209 14:59:28.696747 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:28 crc kubenswrapper[4555]: I1209 14:59:28.696759 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:28 crc kubenswrapper[4555]: I1209 14:59:28.696768 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:28Z","lastTransitionTime":"2025-12-09T14:59:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:28 crc kubenswrapper[4555]: I1209 14:59:28.798731 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:28 crc kubenswrapper[4555]: I1209 14:59:28.798758 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:28 crc kubenswrapper[4555]: I1209 14:59:28.798767 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:28 crc kubenswrapper[4555]: I1209 14:59:28.798776 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:28 crc kubenswrapper[4555]: I1209 14:59:28.798783 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:28Z","lastTransitionTime":"2025-12-09T14:59:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:28 crc kubenswrapper[4555]: I1209 14:59:28.900288 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:28 crc kubenswrapper[4555]: I1209 14:59:28.900327 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:28 crc kubenswrapper[4555]: I1209 14:59:28.900336 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:28 crc kubenswrapper[4555]: I1209 14:59:28.900346 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:28 crc kubenswrapper[4555]: I1209 14:59:28.900353 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:28Z","lastTransitionTime":"2025-12-09T14:59:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:29 crc kubenswrapper[4555]: I1209 14:59:29.002220 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:29 crc kubenswrapper[4555]: I1209 14:59:29.002242 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:29 crc kubenswrapper[4555]: I1209 14:59:29.002249 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:29 crc kubenswrapper[4555]: I1209 14:59:29.002258 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:29 crc kubenswrapper[4555]: I1209 14:59:29.002265 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:29Z","lastTransitionTime":"2025-12-09T14:59:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:29 crc kubenswrapper[4555]: I1209 14:59:29.104559 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:29 crc kubenswrapper[4555]: I1209 14:59:29.104602 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:29 crc kubenswrapper[4555]: I1209 14:59:29.104611 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:29 crc kubenswrapper[4555]: I1209 14:59:29.104620 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:29 crc kubenswrapper[4555]: I1209 14:59:29.104628 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:29Z","lastTransitionTime":"2025-12-09T14:59:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:29 crc kubenswrapper[4555]: I1209 14:59:29.160470 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:59:29 crc kubenswrapper[4555]: I1209 14:59:29.160499 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:59:29 crc kubenswrapper[4555]: E1209 14:59:29.160562 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 14:59:29 crc kubenswrapper[4555]: I1209 14:59:29.160476 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:59:29 crc kubenswrapper[4555]: E1209 14:59:29.160666 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 14:59:29 crc kubenswrapper[4555]: E1209 14:59:29.160729 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 14:59:29 crc kubenswrapper[4555]: I1209 14:59:29.206185 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:29 crc kubenswrapper[4555]: I1209 14:59:29.206222 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:29 crc kubenswrapper[4555]: I1209 14:59:29.206233 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:29 crc kubenswrapper[4555]: I1209 14:59:29.206246 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:29 crc kubenswrapper[4555]: I1209 14:59:29.206256 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:29Z","lastTransitionTime":"2025-12-09T14:59:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:29 crc kubenswrapper[4555]: I1209 14:59:29.307991 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:29 crc kubenswrapper[4555]: I1209 14:59:29.308024 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:29 crc kubenswrapper[4555]: I1209 14:59:29.308032 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:29 crc kubenswrapper[4555]: I1209 14:59:29.308043 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:29 crc kubenswrapper[4555]: I1209 14:59:29.308050 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:29Z","lastTransitionTime":"2025-12-09T14:59:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:29 crc kubenswrapper[4555]: I1209 14:59:29.409740 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:29 crc kubenswrapper[4555]: I1209 14:59:29.409774 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:29 crc kubenswrapper[4555]: I1209 14:59:29.409784 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:29 crc kubenswrapper[4555]: I1209 14:59:29.409796 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:29 crc kubenswrapper[4555]: I1209 14:59:29.409804 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:29Z","lastTransitionTime":"2025-12-09T14:59:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:29 crc kubenswrapper[4555]: I1209 14:59:29.512164 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:29 crc kubenswrapper[4555]: I1209 14:59:29.512197 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:29 crc kubenswrapper[4555]: I1209 14:59:29.512206 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:29 crc kubenswrapper[4555]: I1209 14:59:29.512217 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:29 crc kubenswrapper[4555]: I1209 14:59:29.512226 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:29Z","lastTransitionTime":"2025-12-09T14:59:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:29 crc kubenswrapper[4555]: I1209 14:59:29.613553 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:29 crc kubenswrapper[4555]: I1209 14:59:29.613594 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:29 crc kubenswrapper[4555]: I1209 14:59:29.613607 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:29 crc kubenswrapper[4555]: I1209 14:59:29.613621 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:29 crc kubenswrapper[4555]: I1209 14:59:29.613631 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:29Z","lastTransitionTime":"2025-12-09T14:59:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:29 crc kubenswrapper[4555]: I1209 14:59:29.715657 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:29 crc kubenswrapper[4555]: I1209 14:59:29.715697 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:29 crc kubenswrapper[4555]: I1209 14:59:29.715708 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:29 crc kubenswrapper[4555]: I1209 14:59:29.715721 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:29 crc kubenswrapper[4555]: I1209 14:59:29.715729 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:29Z","lastTransitionTime":"2025-12-09T14:59:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:29 crc kubenswrapper[4555]: I1209 14:59:29.817017 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:29 crc kubenswrapper[4555]: I1209 14:59:29.817047 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:29 crc kubenswrapper[4555]: I1209 14:59:29.817056 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:29 crc kubenswrapper[4555]: I1209 14:59:29.817067 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:29 crc kubenswrapper[4555]: I1209 14:59:29.817075 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:29Z","lastTransitionTime":"2025-12-09T14:59:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:29 crc kubenswrapper[4555]: I1209 14:59:29.919130 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:29 crc kubenswrapper[4555]: I1209 14:59:29.919175 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:29 crc kubenswrapper[4555]: I1209 14:59:29.919184 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:29 crc kubenswrapper[4555]: I1209 14:59:29.919196 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:29 crc kubenswrapper[4555]: I1209 14:59:29.919204 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:29Z","lastTransitionTime":"2025-12-09T14:59:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:30 crc kubenswrapper[4555]: I1209 14:59:30.020802 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:30 crc kubenswrapper[4555]: I1209 14:59:30.020858 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:30 crc kubenswrapper[4555]: I1209 14:59:30.020868 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:30 crc kubenswrapper[4555]: I1209 14:59:30.020883 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:30 crc kubenswrapper[4555]: I1209 14:59:30.020895 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:30Z","lastTransitionTime":"2025-12-09T14:59:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:30 crc kubenswrapper[4555]: I1209 14:59:30.122775 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:30 crc kubenswrapper[4555]: I1209 14:59:30.122836 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:30 crc kubenswrapper[4555]: I1209 14:59:30.122845 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:30 crc kubenswrapper[4555]: I1209 14:59:30.122857 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:30 crc kubenswrapper[4555]: I1209 14:59:30.122864 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:30Z","lastTransitionTime":"2025-12-09T14:59:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:30 crc kubenswrapper[4555]: I1209 14:59:30.160471 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sc2q" Dec 09 14:59:30 crc kubenswrapper[4555]: E1209 14:59:30.160556 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6sc2q" podUID="35f7e7bd-b667-42fa-b040-4a186e4570e3" Dec 09 14:59:30 crc kubenswrapper[4555]: I1209 14:59:30.168860 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffcfa1a3-f24c-4809-8468-636ab9b1a063\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5c1f863d399da019632ceb8d698cf8aa186d0f533ff8a49da7c66dd11086253\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa09f69008b086bb69e768437b40d896803a11f82f6949202ed0a456f3a213be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b44fa2c57434247fea11c1e1fbed34ad1df532e7e6bba0623634541322627f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\
\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a30d185f354ca39197fe30559783400a7d218ef0a162fd5fa5e5039d1946d127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a30d185f354ca39197fe30559783400a7d218ef0a162fd5fa5e5039d1946d127\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:30Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:30 crc kubenswrapper[4555]: I1209 14:59:30.176984 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d81323acabedcf739459ccc35af7dcbf68cd1159c9304c0ef841d774287425cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:30Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:30 crc kubenswrapper[4555]: I1209 14:59:30.184546 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:30Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:30 crc kubenswrapper[4555]: I1209 14:59:30.192093 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://222e3c51a925bbfd932c3620f13ad919006e2dad2c9063c60559ece4dd239f3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:30Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:30 crc kubenswrapper[4555]: I1209 14:59:30.200020 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gjjcm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e34542c-29e8-43d3-add3-42aa080e0c8a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73a16b54312a5c81ee3d5b4753cd53c9a01cc7f174b340b1f0db44b019541fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b138ef0764baeeae025ff996cd643edabbc1de8f567e73f41fa748f481929656\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T14:59:18Z\\\",\\\"message\\\":\\\"2025-12-09T14:58:33+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_43b55716-79db-4d3f-93e7-dbc41035fea1\\\\n2025-12-09T14:58:33+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_43b55716-79db-4d3f-93e7-dbc41035fea1 to /host/opt/cni/bin/\\\\n2025-12-09T14:58:33Z [verbose] multus-daemon started\\\\n2025-12-09T14:58:33Z [verbose] Readiness Indicator file check\\\\n2025-12-09T14:59:18Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqc9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gjjcm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:30Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:30 crc kubenswrapper[4555]: I1209 14:59:30.212176 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c87d128c-39da-4880-9efc-7d2e6319a854\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dee71204e7a71c311143a6ed6fbde69968bc877053d6a400d252eaa010a88ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6cd4a3aeb4f2f1ad5ad6b22f80134a393ccb4c5b7e22c33d97324c8f70800d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a05bba24ff7f1d956fa0d3cb5949de61a278cddb4674bdb723749a2b005d82d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec3d3a350dce6c9b5ea637d00abdaf43796a5595105dc82133f3d84349c12c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://932166905ddf0fc44b9b7356a8f5efe68e1d44307d8e486946c8285ae969b8e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64685d2778ef0cf9d316897042a9e9353603a0cc7a234baf821ec03b149aeb07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-s
ocket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2496bb1d7c5b48f9944b2026978276ba9cf676ecd42dff619698a1ab8776269\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2496bb1d7c5b48f9944b2026978276ba9cf676ecd42dff619698a1ab8776269\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T14:59:23Z\\\",\\\"message\\\":\\\"router-internal-default\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.176\\\\\\\", Port:80, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.176\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.176\\\\\\\", Port:1936, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1209 14:59:23.772112 6583 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-operator\\\\\\\"}\\\\nF1209 14:59:23.772066 6583 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default nod\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:23Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-p4bl4_openshift-ovn-kubernetes(c87d128c-39da-4880-9efc-7d2e6319a854)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9dc3533604ee31d652a75a8c65146c35b320c41f2d4fe85c3c03cadb99a9d46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ce6bcdccee7782ed6a2472052064ea5f0f9a5f25a8645c692165a2a4bf16dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9ce6bcdccee7782ed6a2472052064ea5f0f9a5f25a8645c692165a2a4bf16dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7p8n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p4bl4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:30Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:30 crc kubenswrapper[4555]: I1209 14:59:30.222211 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-72msc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4f13f82-a44f-461d-93b0-916a7ed0b925\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06c787dc46c305e7388f67f85601eb3e724c501d4da0504d4f8fd569f533c634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2spqd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-72msc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:30Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:30 crc kubenswrapper[4555]: I1209 14:59:30.224098 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:30 crc kubenswrapper[4555]: I1209 14:59:30.224128 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:30 crc kubenswrapper[4555]: I1209 14:59:30.224138 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:30 crc kubenswrapper[4555]: I1209 14:59:30.224149 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:30 crc kubenswrapper[4555]: I1209 14:59:30.224157 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:30Z","lastTransitionTime":"2025-12-09T14:59:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:30 crc kubenswrapper[4555]: I1209 14:59:30.234137 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cb89a32-a132-4a5c-88ca-e561d2462abb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8452332d27457e371020c42533ef78285e33851568eed69be71bc9d10e663b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32422f3e8abddcc0d533cce0b4fc55c65309e00a02d4757d415d6f63f9b2e47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9db356805ba4080379f39ddf432276ac77417504341e77c1cc2d704756af1f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550d7f8a2e6cf94de70026192f2660d59f31fa1675a7c7e432767db5bf58c1b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://630ad7ccda8e51ae4d116b5821e71ebcc3fb68b4614de58470610807ad351ed6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f964fbe7500852e1d6556136f949f474f8c491b9281bac611587caf277880766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2da96751421ec75b55016754205f19f71c21879724ba92f73ffbfdc81d8e1f19\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc0b7d4c7e77635969ae7b36067816e59cdeed0abb4c5eaf17810d0e272f8820\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:30Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:30 crc kubenswrapper[4555]: I1209 14:59:30.241723 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:30Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:30 crc kubenswrapper[4555]: I1209 14:59:30.249405 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd5a66337668da0b71f8033ec52d3d1ed9ac69f468d6e48c1e863204ec6a6349\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2fd50693723f3d30a7f5fb890ed870f55b357017b85cdf7206ffc97d0d3d9b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:30Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:30 crc kubenswrapper[4555]: I1209 14:59:30.256149 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b929ab12-7d60-4e9f-a1a7-3e2d8b086d36\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04e20c1851a9313f071ddc6ae1191e467f3bea284ea6e0defd914dc5c926b7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff6nf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b220eef2e4180fd65d4a5381999b91753bb7322a9ab7101edf783cb800715fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernete
s.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff6nf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x6p6q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:30Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:30 crc kubenswrapper[4555]: I1209 14:59:30.264821 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tpsq2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd8f66c1-da34-466f-ba53-4848c96b1287\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ae3cc97b46fa36f9bd6f439124d81700af58cb44671d46ffa2936260f6dda4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2028082f4c35a431ec183abac3ace39ae9d9d292a8b3b879cf35cf03dc606d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2028082f4c35a431ec183abac3ace39ae9d9d292a8b3b879cf35cf03dc606d
21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f6498db60b815cc152db11a9d809f415e6ed632f66cf7eaeecbacb230d59dbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f6498db60b815cc152db11a9d809f415e6ed632f66cf7eaeecbacb230d59dbe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd308a4726360ffaa40fd9895d4684062d998a9dec9c76222a4be5140ece0ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd308a4726360ffaa40fd9895d4684062d998a9dec9c76222a4be5140ece0ced\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e047250a4bced8e37ee26d9ba3065a61294fa9cc6d9491b6b96e35758b431edc\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e047250a4bced8e37ee26d9ba3065a61294fa9cc6d9491b6b96e35758b431edc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d24862fb9296afeb3ec86c0cd5e862021c7e4c8a0f139f9fd11dfd1faba2b392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d24862fb9296afeb3ec86c0cd5e862021c7e4c8a0f139f9fd11dfd1faba2b392\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ac07e538318096b1c9c2c8d0608f0b4d9b4367df0bb24575591b2f3f6c28c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ac07e538318096b1c9c2c8d0608f0b4d9b4367df0bb24575591b2f3f6c28c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrmwb\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tpsq2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:30Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:30 crc kubenswrapper[4555]: I1209 14:59:30.271397 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qchk6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae951ddc-0e90-497c-af85-8e39088f32d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59535f0584581ae9d42fa3c1d9f4f204c8a2c7c96d211a2b8d74806eb3901a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwhfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1a5e10f192fc24405b980ce611a09f1e0c6d49c4c94df89bceef151a747f1aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountP
ath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwhfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qchk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:30Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:30 crc kubenswrapper[4555]: I1209 14:59:30.277460 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3a4e758-0a2b-45c2-9314-85caacad3889\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6504dd2b92e68c220a759270fa99ddc38e403112a4c10b4eaf5d15539769c3fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7803e784d08586ae7bb48c47ef9e647dd0e53d5eb079ac8f4cfd1b74bcd15564\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7803e784d08586ae7bb48c47ef9e647dd0e53d5eb079ac8f4cfd1b74bcd15564\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"
Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:30Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:30 crc kubenswrapper[4555]: I1209 14:59:30.284001 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t6kl9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c97d4dc5-ce47-4a7b-9511-7168db0ee4b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd97a458fe790bf9d7a8587a77d464c8bc4f97919bf4646660d92bd016be2707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47xhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t6kl9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:30Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:30 crc kubenswrapper[4555]: I1209 14:59:30.291912 4555 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23358419-437c-41bf-bf1f-0da572d7ae77\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5ec008c2fa2ff6b0bffe7a32708ff1d0bfcfc0b6edbdd084977b99a2051bdc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://376e853c04e3ccafef162872380c7bf1dc0301c04c70c5428b965ca18989088f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6099f0072464b66ae672e70c295fe21a13657b81cc71dcecc55314cd91816c75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b8f1c6dd89c91a8ff2ca8baf361ee78811eeb8e10f2c4bb05
04644cdd1302d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:30Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:30 crc kubenswrapper[4555]: I1209 14:59:30.299645 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:30Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:30 crc kubenswrapper[4555]: I1209 14:59:30.306478 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6sc2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35f7e7bd-b667-42fa-b040-4a186e4570e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mc8nv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mc8nv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:46Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6sc2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:30Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:30 crc kubenswrapper[4555]: I1209 14:59:30.314291 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"031bef68-de1b-47ae-92d9-8f3ab57baecb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd53bcc6affff59de2add405be1496710b4488d3794d358533ddaeecb1c50c42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cd3af11c101edd63da0124a2190c0ee60408cf12e241de0b5743aff5732738\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://787954aa7f87026339e7b167ba10c60c81f91c4d0f234fc508684f929edda0ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c591bf964c364ed5dfb0b30ae43f7fd456c549f701b0f1b8f7ab9ca7b8648f1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7803e19de1c3c096dfb29cd40b66a640cf8fe695b24eaf0ecfef38e88feefc27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:30Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:30 crc kubenswrapper[4555]: I1209 14:59:30.326332 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:30 crc kubenswrapper[4555]: I1209 14:59:30.326360 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:30 crc kubenswrapper[4555]: I1209 14:59:30.326368 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:30 crc kubenswrapper[4555]: I1209 14:59:30.326379 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 
14:59:30 crc kubenswrapper[4555]: I1209 14:59:30.326388 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:30Z","lastTransitionTime":"2025-12-09T14:59:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:30 crc kubenswrapper[4555]: I1209 14:59:30.427943 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:30 crc kubenswrapper[4555]: I1209 14:59:30.427970 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:30 crc kubenswrapper[4555]: I1209 14:59:30.427979 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:30 crc kubenswrapper[4555]: I1209 14:59:30.427990 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:30 crc kubenswrapper[4555]: I1209 14:59:30.427997 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:30Z","lastTransitionTime":"2025-12-09T14:59:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:30 crc kubenswrapper[4555]: I1209 14:59:30.529512 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:30 crc kubenswrapper[4555]: I1209 14:59:30.529537 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:30 crc kubenswrapper[4555]: I1209 14:59:30.529545 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:30 crc kubenswrapper[4555]: I1209 14:59:30.529554 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:30 crc kubenswrapper[4555]: I1209 14:59:30.529561 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:30Z","lastTransitionTime":"2025-12-09T14:59:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:30 crc kubenswrapper[4555]: I1209 14:59:30.630980 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:30 crc kubenswrapper[4555]: I1209 14:59:30.631005 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:30 crc kubenswrapper[4555]: I1209 14:59:30.631014 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:30 crc kubenswrapper[4555]: I1209 14:59:30.631023 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:30 crc kubenswrapper[4555]: I1209 14:59:30.631029 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:30Z","lastTransitionTime":"2025-12-09T14:59:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:30 crc kubenswrapper[4555]: I1209 14:59:30.732798 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:30 crc kubenswrapper[4555]: I1209 14:59:30.732835 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:30 crc kubenswrapper[4555]: I1209 14:59:30.732842 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:30 crc kubenswrapper[4555]: I1209 14:59:30.732852 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:30 crc kubenswrapper[4555]: I1209 14:59:30.732859 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:30Z","lastTransitionTime":"2025-12-09T14:59:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:30 crc kubenswrapper[4555]: I1209 14:59:30.834392 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:30 crc kubenswrapper[4555]: I1209 14:59:30.834421 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:30 crc kubenswrapper[4555]: I1209 14:59:30.834451 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:30 crc kubenswrapper[4555]: I1209 14:59:30.834462 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:30 crc kubenswrapper[4555]: I1209 14:59:30.834470 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:30Z","lastTransitionTime":"2025-12-09T14:59:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:30 crc kubenswrapper[4555]: I1209 14:59:30.936060 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:30 crc kubenswrapper[4555]: I1209 14:59:30.936088 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:30 crc kubenswrapper[4555]: I1209 14:59:30.936097 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:30 crc kubenswrapper[4555]: I1209 14:59:30.936106 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:30 crc kubenswrapper[4555]: I1209 14:59:30.936113 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:30Z","lastTransitionTime":"2025-12-09T14:59:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.037676 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.037708 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.037718 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.037730 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.037743 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:31Z","lastTransitionTime":"2025-12-09T14:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.069551 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 14:59:31 crc kubenswrapper[4555]: E1209 14:59:31.069678 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 15:00:35.069654968 +0000 UTC m=+145.206367717 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.069771 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.069847 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:59:31 crc kubenswrapper[4555]: E1209 14:59:31.069897 4555 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 09 14:59:31 crc kubenswrapper[4555]: E1209 14:59:31.069947 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-09 15:00:35.069932482 +0000 UTC m=+145.206645210 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 09 14:59:31 crc kubenswrapper[4555]: E1209 14:59:31.069979 4555 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 09 14:59:31 crc kubenswrapper[4555]: E1209 14:59:31.070014 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-09 15:00:35.070007854 +0000 UTC m=+145.206720572 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.139032 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.139073 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.139083 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.139094 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.139101 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:31Z","lastTransitionTime":"2025-12-09T14:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.160432 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.160476 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:59:31 crc kubenswrapper[4555]: E1209 14:59:31.160504 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.160544 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:59:31 crc kubenswrapper[4555]: E1209 14:59:31.160592 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 14:59:31 crc kubenswrapper[4555]: E1209 14:59:31.160748 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.170208 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.170235 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:59:31 crc kubenswrapper[4555]: E1209 14:59:31.170325 4555 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 09 14:59:31 crc kubenswrapper[4555]: E1209 14:59:31.170342 4555 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 09 14:59:31 crc kubenswrapper[4555]: E1209 14:59:31.170352 4555 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 14:59:31 crc kubenswrapper[4555]: E1209 14:59:31.170403 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-09 15:00:35.170391692 +0000 UTC m=+145.307104410 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 14:59:31 crc kubenswrapper[4555]: E1209 14:59:31.170409 4555 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 09 14:59:31 crc kubenswrapper[4555]: E1209 14:59:31.170429 4555 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 09 14:59:31 crc kubenswrapper[4555]: E1209 14:59:31.170440 4555 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 14:59:31 crc kubenswrapper[4555]: E1209 14:59:31.170479 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-09 15:00:35.170466634 +0000 UTC m=+145.307179352 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.241550 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.241573 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.241581 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.241590 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.241597 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:31Z","lastTransitionTime":"2025-12-09T14:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.343269 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.343293 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.343315 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.343326 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.343336 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:31Z","lastTransitionTime":"2025-12-09T14:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.430697 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.430736 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.430747 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.430758 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.430766 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:31Z","lastTransitionTime":"2025-12-09T14:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:31 crc kubenswrapper[4555]: E1209 14:59:31.438741 4555 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"73387e21-fe11-4735-b3c1-0a3cfb47f8fd\\\",\\\"systemUUID\\\":\\\"bd9f288e-1f46-4e96-a21f-7eb84ab89d9c\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:31Z is after 
2025-08-24T17:21:41Z" Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.441145 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.441170 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.441177 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.441189 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.441197 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:31Z","lastTransitionTime":"2025-12-09T14:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:31 crc kubenswrapper[4555]: E1209 14:59:31.449084 4555 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"73387e21-fe11-4735-b3c1-0a3cfb47f8fd\\\",\\\"systemUUID\\\":\\\"bd9f288e-1f46-4e96-a21f-7eb84ab89d9c\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:31Z is after 
2025-08-24T17:21:41Z" Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.451320 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.451344 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.451352 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.451361 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.451368 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:31Z","lastTransitionTime":"2025-12-09T14:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:31 crc kubenswrapper[4555]: E1209 14:59:31.458607 4555 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"73387e21-fe11-4735-b3c1-0a3cfb47f8fd\\\",\\\"systemUUID\\\":\\\"bd9f288e-1f46-4e96-a21f-7eb84ab89d9c\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:31Z is after 
2025-08-24T17:21:41Z" Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.460805 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.460908 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.460994 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.461053 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.461111 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:31Z","lastTransitionTime":"2025-12-09T14:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:31 crc kubenswrapper[4555]: E1209 14:59:31.468916 4555 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"73387e21-fe11-4735-b3c1-0a3cfb47f8fd\\\",\\\"systemUUID\\\":\\\"bd9f288e-1f46-4e96-a21f-7eb84ab89d9c\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:31Z is after 
2025-08-24T17:21:41Z" Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.470945 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.471026 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.471100 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.471161 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.471220 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:31Z","lastTransitionTime":"2025-12-09T14:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:31 crc kubenswrapper[4555]: E1209 14:59:31.478789 4555 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"73387e21-fe11-4735-b3c1-0a3cfb47f8fd\\\",\\\"systemUUID\\\":\\\"bd9f288e-1f46-4e96-a21f-7eb84ab89d9c\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:31Z is after 
2025-08-24T17:21:41Z" Dec 09 14:59:31 crc kubenswrapper[4555]: E1209 14:59:31.478940 4555 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.479812 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.479901 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.479968 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.480020 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.480093 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:31Z","lastTransitionTime":"2025-12-09T14:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.581935 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.581967 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.581975 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.581986 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.581994 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:31Z","lastTransitionTime":"2025-12-09T14:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.683840 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.683867 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.683874 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.683886 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.683894 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:31Z","lastTransitionTime":"2025-12-09T14:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.785521 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.785556 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.785565 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.785578 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.785587 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:31Z","lastTransitionTime":"2025-12-09T14:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.887205 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.887243 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.887252 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.887268 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.887279 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:31Z","lastTransitionTime":"2025-12-09T14:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.988784 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.988830 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.988839 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.988851 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:31 crc kubenswrapper[4555]: I1209 14:59:31.988859 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:31Z","lastTransitionTime":"2025-12-09T14:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:32 crc kubenswrapper[4555]: I1209 14:59:32.091025 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:32 crc kubenswrapper[4555]: I1209 14:59:32.091057 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:32 crc kubenswrapper[4555]: I1209 14:59:32.091070 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:32 crc kubenswrapper[4555]: I1209 14:59:32.091080 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:32 crc kubenswrapper[4555]: I1209 14:59:32.091087 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:32Z","lastTransitionTime":"2025-12-09T14:59:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:32 crc kubenswrapper[4555]: I1209 14:59:32.160769 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sc2q" Dec 09 14:59:32 crc kubenswrapper[4555]: E1209 14:59:32.160905 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6sc2q" podUID="35f7e7bd-b667-42fa-b040-4a186e4570e3" Dec 09 14:59:32 crc kubenswrapper[4555]: I1209 14:59:32.192953 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:32 crc kubenswrapper[4555]: I1209 14:59:32.192986 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:32 crc kubenswrapper[4555]: I1209 14:59:32.192996 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:32 crc kubenswrapper[4555]: I1209 14:59:32.193008 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:32 crc kubenswrapper[4555]: I1209 14:59:32.193017 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:32Z","lastTransitionTime":"2025-12-09T14:59:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:32 crc kubenswrapper[4555]: I1209 14:59:32.295096 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:32 crc kubenswrapper[4555]: I1209 14:59:32.295137 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:32 crc kubenswrapper[4555]: I1209 14:59:32.295148 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:32 crc kubenswrapper[4555]: I1209 14:59:32.295163 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:32 crc kubenswrapper[4555]: I1209 14:59:32.295177 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:32Z","lastTransitionTime":"2025-12-09T14:59:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:32 crc kubenswrapper[4555]: I1209 14:59:32.396543 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:32 crc kubenswrapper[4555]: I1209 14:59:32.396570 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:32 crc kubenswrapper[4555]: I1209 14:59:32.396579 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:32 crc kubenswrapper[4555]: I1209 14:59:32.396588 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:32 crc kubenswrapper[4555]: I1209 14:59:32.396597 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:32Z","lastTransitionTime":"2025-12-09T14:59:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:32 crc kubenswrapper[4555]: I1209 14:59:32.498603 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:32 crc kubenswrapper[4555]: I1209 14:59:32.498643 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:32 crc kubenswrapper[4555]: I1209 14:59:32.498667 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:32 crc kubenswrapper[4555]: I1209 14:59:32.498685 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:32 crc kubenswrapper[4555]: I1209 14:59:32.498693 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:32Z","lastTransitionTime":"2025-12-09T14:59:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:32 crc kubenswrapper[4555]: I1209 14:59:32.600141 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:32 crc kubenswrapper[4555]: I1209 14:59:32.600186 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:32 crc kubenswrapper[4555]: I1209 14:59:32.600195 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:32 crc kubenswrapper[4555]: I1209 14:59:32.600206 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:32 crc kubenswrapper[4555]: I1209 14:59:32.600216 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:32Z","lastTransitionTime":"2025-12-09T14:59:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:32 crc kubenswrapper[4555]: I1209 14:59:32.701783 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:32 crc kubenswrapper[4555]: I1209 14:59:32.701828 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:32 crc kubenswrapper[4555]: I1209 14:59:32.701837 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:32 crc kubenswrapper[4555]: I1209 14:59:32.701850 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:32 crc kubenswrapper[4555]: I1209 14:59:32.701858 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:32Z","lastTransitionTime":"2025-12-09T14:59:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:32 crc kubenswrapper[4555]: I1209 14:59:32.803393 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:32 crc kubenswrapper[4555]: I1209 14:59:32.803537 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:32 crc kubenswrapper[4555]: I1209 14:59:32.803624 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:32 crc kubenswrapper[4555]: I1209 14:59:32.803681 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:32 crc kubenswrapper[4555]: I1209 14:59:32.803737 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:32Z","lastTransitionTime":"2025-12-09T14:59:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:32 crc kubenswrapper[4555]: I1209 14:59:32.905831 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:32 crc kubenswrapper[4555]: I1209 14:59:32.905861 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:32 crc kubenswrapper[4555]: I1209 14:59:32.905869 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:32 crc kubenswrapper[4555]: I1209 14:59:32.905879 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:32 crc kubenswrapper[4555]: I1209 14:59:32.905886 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:32Z","lastTransitionTime":"2025-12-09T14:59:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:33 crc kubenswrapper[4555]: I1209 14:59:33.007697 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:33 crc kubenswrapper[4555]: I1209 14:59:33.007728 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:33 crc kubenswrapper[4555]: I1209 14:59:33.007737 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:33 crc kubenswrapper[4555]: I1209 14:59:33.007751 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:33 crc kubenswrapper[4555]: I1209 14:59:33.007761 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:33Z","lastTransitionTime":"2025-12-09T14:59:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:33 crc kubenswrapper[4555]: I1209 14:59:33.109891 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:33 crc kubenswrapper[4555]: I1209 14:59:33.110017 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:33 crc kubenswrapper[4555]: I1209 14:59:33.110097 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:33 crc kubenswrapper[4555]: I1209 14:59:33.110156 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:33 crc kubenswrapper[4555]: I1209 14:59:33.110214 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:33Z","lastTransitionTime":"2025-12-09T14:59:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:33 crc kubenswrapper[4555]: I1209 14:59:33.160330 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:59:33 crc kubenswrapper[4555]: I1209 14:59:33.160384 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:59:33 crc kubenswrapper[4555]: E1209 14:59:33.160474 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 14:59:33 crc kubenswrapper[4555]: I1209 14:59:33.160487 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:59:33 crc kubenswrapper[4555]: E1209 14:59:33.160564 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 14:59:33 crc kubenswrapper[4555]: E1209 14:59:33.160606 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 14:59:33 crc kubenswrapper[4555]: I1209 14:59:33.212404 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:33 crc kubenswrapper[4555]: I1209 14:59:33.212428 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:33 crc kubenswrapper[4555]: I1209 14:59:33.212435 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:33 crc kubenswrapper[4555]: I1209 14:59:33.212446 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:33 crc kubenswrapper[4555]: I1209 14:59:33.212455 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:33Z","lastTransitionTime":"2025-12-09T14:59:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:33 crc kubenswrapper[4555]: I1209 14:59:33.314574 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:33 crc kubenswrapper[4555]: I1209 14:59:33.314601 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:33 crc kubenswrapper[4555]: I1209 14:59:33.314610 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:33 crc kubenswrapper[4555]: I1209 14:59:33.314623 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:33 crc kubenswrapper[4555]: I1209 14:59:33.314632 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:33Z","lastTransitionTime":"2025-12-09T14:59:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:33 crc kubenswrapper[4555]: I1209 14:59:33.416011 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:33 crc kubenswrapper[4555]: I1209 14:59:33.416057 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:33 crc kubenswrapper[4555]: I1209 14:59:33.416066 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:33 crc kubenswrapper[4555]: I1209 14:59:33.416080 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:33 crc kubenswrapper[4555]: I1209 14:59:33.416098 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:33Z","lastTransitionTime":"2025-12-09T14:59:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:33 crc kubenswrapper[4555]: I1209 14:59:33.517554 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:33 crc kubenswrapper[4555]: I1209 14:59:33.517584 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:33 crc kubenswrapper[4555]: I1209 14:59:33.517594 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:33 crc kubenswrapper[4555]: I1209 14:59:33.517622 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:33 crc kubenswrapper[4555]: I1209 14:59:33.517631 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:33Z","lastTransitionTime":"2025-12-09T14:59:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:33 crc kubenswrapper[4555]: I1209 14:59:33.619951 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:33 crc kubenswrapper[4555]: I1209 14:59:33.620002 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:33 crc kubenswrapper[4555]: I1209 14:59:33.620012 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:33 crc kubenswrapper[4555]: I1209 14:59:33.620027 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:33 crc kubenswrapper[4555]: I1209 14:59:33.620036 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:33Z","lastTransitionTime":"2025-12-09T14:59:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:33 crc kubenswrapper[4555]: I1209 14:59:33.721710 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:33 crc kubenswrapper[4555]: I1209 14:59:33.721745 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:33 crc kubenswrapper[4555]: I1209 14:59:33.721753 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:33 crc kubenswrapper[4555]: I1209 14:59:33.721766 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:33 crc kubenswrapper[4555]: I1209 14:59:33.721775 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:33Z","lastTransitionTime":"2025-12-09T14:59:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:33 crc kubenswrapper[4555]: I1209 14:59:33.823799 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:33 crc kubenswrapper[4555]: I1209 14:59:33.823841 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:33 crc kubenswrapper[4555]: I1209 14:59:33.823849 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:33 crc kubenswrapper[4555]: I1209 14:59:33.823861 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:33 crc kubenswrapper[4555]: I1209 14:59:33.823869 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:33Z","lastTransitionTime":"2025-12-09T14:59:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:33 crc kubenswrapper[4555]: I1209 14:59:33.925523 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:33 crc kubenswrapper[4555]: I1209 14:59:33.925691 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:33 crc kubenswrapper[4555]: I1209 14:59:33.925700 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:33 crc kubenswrapper[4555]: I1209 14:59:33.925711 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:33 crc kubenswrapper[4555]: I1209 14:59:33.925719 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:33Z","lastTransitionTime":"2025-12-09T14:59:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:34 crc kubenswrapper[4555]: I1209 14:59:34.027146 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:34 crc kubenswrapper[4555]: I1209 14:59:34.027178 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:34 crc kubenswrapper[4555]: I1209 14:59:34.027186 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:34 crc kubenswrapper[4555]: I1209 14:59:34.027197 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:34 crc kubenswrapper[4555]: I1209 14:59:34.027205 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:34Z","lastTransitionTime":"2025-12-09T14:59:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:34 crc kubenswrapper[4555]: I1209 14:59:34.129127 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:34 crc kubenswrapper[4555]: I1209 14:59:34.129158 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:34 crc kubenswrapper[4555]: I1209 14:59:34.129166 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:34 crc kubenswrapper[4555]: I1209 14:59:34.129178 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:34 crc kubenswrapper[4555]: I1209 14:59:34.129188 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:34Z","lastTransitionTime":"2025-12-09T14:59:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:34 crc kubenswrapper[4555]: I1209 14:59:34.160741 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sc2q" Dec 09 14:59:34 crc kubenswrapper[4555]: E1209 14:59:34.160880 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6sc2q" podUID="35f7e7bd-b667-42fa-b040-4a186e4570e3" Dec 09 14:59:34 crc kubenswrapper[4555]: I1209 14:59:34.231076 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:34 crc kubenswrapper[4555]: I1209 14:59:34.231097 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:34 crc kubenswrapper[4555]: I1209 14:59:34.231105 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:34 crc kubenswrapper[4555]: I1209 14:59:34.231114 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:34 crc kubenswrapper[4555]: I1209 14:59:34.231123 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:34Z","lastTransitionTime":"2025-12-09T14:59:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:34 crc kubenswrapper[4555]: I1209 14:59:34.333242 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:34 crc kubenswrapper[4555]: I1209 14:59:34.333275 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:34 crc kubenswrapper[4555]: I1209 14:59:34.333284 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:34 crc kubenswrapper[4555]: I1209 14:59:34.333296 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:34 crc kubenswrapper[4555]: I1209 14:59:34.333324 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:34Z","lastTransitionTime":"2025-12-09T14:59:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:34 crc kubenswrapper[4555]: I1209 14:59:34.434886 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:34 crc kubenswrapper[4555]: I1209 14:59:34.434918 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:34 crc kubenswrapper[4555]: I1209 14:59:34.434928 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:34 crc kubenswrapper[4555]: I1209 14:59:34.434940 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:34 crc kubenswrapper[4555]: I1209 14:59:34.434948 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:34Z","lastTransitionTime":"2025-12-09T14:59:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:34 crc kubenswrapper[4555]: I1209 14:59:34.536916 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:34 crc kubenswrapper[4555]: I1209 14:59:34.537042 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:34 crc kubenswrapper[4555]: I1209 14:59:34.537101 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:34 crc kubenswrapper[4555]: I1209 14:59:34.537156 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:34 crc kubenswrapper[4555]: I1209 14:59:34.537218 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:34Z","lastTransitionTime":"2025-12-09T14:59:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:34 crc kubenswrapper[4555]: I1209 14:59:34.639539 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:34 crc kubenswrapper[4555]: I1209 14:59:34.639576 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:34 crc kubenswrapper[4555]: I1209 14:59:34.639585 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:34 crc kubenswrapper[4555]: I1209 14:59:34.639597 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:34 crc kubenswrapper[4555]: I1209 14:59:34.639605 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:34Z","lastTransitionTime":"2025-12-09T14:59:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:34 crc kubenswrapper[4555]: I1209 14:59:34.741680 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:34 crc kubenswrapper[4555]: I1209 14:59:34.741721 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:34 crc kubenswrapper[4555]: I1209 14:59:34.741731 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:34 crc kubenswrapper[4555]: I1209 14:59:34.741746 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:34 crc kubenswrapper[4555]: I1209 14:59:34.741756 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:34Z","lastTransitionTime":"2025-12-09T14:59:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:34 crc kubenswrapper[4555]: I1209 14:59:34.844172 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:34 crc kubenswrapper[4555]: I1209 14:59:34.844206 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:34 crc kubenswrapper[4555]: I1209 14:59:34.844214 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:34 crc kubenswrapper[4555]: I1209 14:59:34.844226 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:34 crc kubenswrapper[4555]: I1209 14:59:34.844233 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:34Z","lastTransitionTime":"2025-12-09T14:59:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:34 crc kubenswrapper[4555]: I1209 14:59:34.946051 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:34 crc kubenswrapper[4555]: I1209 14:59:34.946077 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:34 crc kubenswrapper[4555]: I1209 14:59:34.946086 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:34 crc kubenswrapper[4555]: I1209 14:59:34.946095 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:34 crc kubenswrapper[4555]: I1209 14:59:34.946103 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:34Z","lastTransitionTime":"2025-12-09T14:59:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:35 crc kubenswrapper[4555]: I1209 14:59:35.047626 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:35 crc kubenswrapper[4555]: I1209 14:59:35.047662 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:35 crc kubenswrapper[4555]: I1209 14:59:35.047670 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:35 crc kubenswrapper[4555]: I1209 14:59:35.047680 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:35 crc kubenswrapper[4555]: I1209 14:59:35.047686 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:35Z","lastTransitionTime":"2025-12-09T14:59:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:35 crc kubenswrapper[4555]: I1209 14:59:35.149797 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:35 crc kubenswrapper[4555]: I1209 14:59:35.149824 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:35 crc kubenswrapper[4555]: I1209 14:59:35.149831 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:35 crc kubenswrapper[4555]: I1209 14:59:35.149842 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:35 crc kubenswrapper[4555]: I1209 14:59:35.149850 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:35Z","lastTransitionTime":"2025-12-09T14:59:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:35 crc kubenswrapper[4555]: I1209 14:59:35.160043 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:59:35 crc kubenswrapper[4555]: I1209 14:59:35.160047 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:59:35 crc kubenswrapper[4555]: I1209 14:59:35.160091 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:59:35 crc kubenswrapper[4555]: E1209 14:59:35.160175 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 14:59:35 crc kubenswrapper[4555]: E1209 14:59:35.160213 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 14:59:35 crc kubenswrapper[4555]: E1209 14:59:35.160255 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 14:59:35 crc kubenswrapper[4555]: I1209 14:59:35.252055 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:35 crc kubenswrapper[4555]: I1209 14:59:35.252085 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:35 crc kubenswrapper[4555]: I1209 14:59:35.252094 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:35 crc kubenswrapper[4555]: I1209 14:59:35.252106 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:35 crc kubenswrapper[4555]: I1209 14:59:35.252115 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:35Z","lastTransitionTime":"2025-12-09T14:59:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:35 crc kubenswrapper[4555]: I1209 14:59:35.353349 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:35 crc kubenswrapper[4555]: I1209 14:59:35.353372 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:35 crc kubenswrapper[4555]: I1209 14:59:35.353380 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:35 crc kubenswrapper[4555]: I1209 14:59:35.353391 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:35 crc kubenswrapper[4555]: I1209 14:59:35.353399 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:35Z","lastTransitionTime":"2025-12-09T14:59:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:35 crc kubenswrapper[4555]: I1209 14:59:35.454464 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:35 crc kubenswrapper[4555]: I1209 14:59:35.454492 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:35 crc kubenswrapper[4555]: I1209 14:59:35.454500 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:35 crc kubenswrapper[4555]: I1209 14:59:35.454510 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:35 crc kubenswrapper[4555]: I1209 14:59:35.454517 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:35Z","lastTransitionTime":"2025-12-09T14:59:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:35 crc kubenswrapper[4555]: I1209 14:59:35.556488 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:35 crc kubenswrapper[4555]: I1209 14:59:35.556518 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:35 crc kubenswrapper[4555]: I1209 14:59:35.556528 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:35 crc kubenswrapper[4555]: I1209 14:59:35.556540 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:35 crc kubenswrapper[4555]: I1209 14:59:35.556551 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:35Z","lastTransitionTime":"2025-12-09T14:59:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:35 crc kubenswrapper[4555]: I1209 14:59:35.658579 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:35 crc kubenswrapper[4555]: I1209 14:59:35.658613 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:35 crc kubenswrapper[4555]: I1209 14:59:35.658622 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:35 crc kubenswrapper[4555]: I1209 14:59:35.658634 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:35 crc kubenswrapper[4555]: I1209 14:59:35.658646 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:35Z","lastTransitionTime":"2025-12-09T14:59:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:35 crc kubenswrapper[4555]: I1209 14:59:35.760886 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:35 crc kubenswrapper[4555]: I1209 14:59:35.760923 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:35 crc kubenswrapper[4555]: I1209 14:59:35.760933 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:35 crc kubenswrapper[4555]: I1209 14:59:35.760945 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:35 crc kubenswrapper[4555]: I1209 14:59:35.760953 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:35Z","lastTransitionTime":"2025-12-09T14:59:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:35 crc kubenswrapper[4555]: I1209 14:59:35.862690 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:35 crc kubenswrapper[4555]: I1209 14:59:35.862719 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:35 crc kubenswrapper[4555]: I1209 14:59:35.862727 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:35 crc kubenswrapper[4555]: I1209 14:59:35.862736 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:35 crc kubenswrapper[4555]: I1209 14:59:35.862744 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:35Z","lastTransitionTime":"2025-12-09T14:59:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:35 crc kubenswrapper[4555]: I1209 14:59:35.964835 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:35 crc kubenswrapper[4555]: I1209 14:59:35.964863 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:35 crc kubenswrapper[4555]: I1209 14:59:35.964871 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:35 crc kubenswrapper[4555]: I1209 14:59:35.964880 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:35 crc kubenswrapper[4555]: I1209 14:59:35.964888 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:35Z","lastTransitionTime":"2025-12-09T14:59:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:36 crc kubenswrapper[4555]: I1209 14:59:36.067062 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:36 crc kubenswrapper[4555]: I1209 14:59:36.067098 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:36 crc kubenswrapper[4555]: I1209 14:59:36.067106 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:36 crc kubenswrapper[4555]: I1209 14:59:36.067120 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:36 crc kubenswrapper[4555]: I1209 14:59:36.067128 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:36Z","lastTransitionTime":"2025-12-09T14:59:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:36 crc kubenswrapper[4555]: I1209 14:59:36.160126 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sc2q" Dec 09 14:59:36 crc kubenswrapper[4555]: E1209 14:59:36.160209 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6sc2q" podUID="35f7e7bd-b667-42fa-b040-4a186e4570e3" Dec 09 14:59:36 crc kubenswrapper[4555]: I1209 14:59:36.168833 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:36 crc kubenswrapper[4555]: I1209 14:59:36.168861 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:36 crc kubenswrapper[4555]: I1209 14:59:36.168870 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:36 crc kubenswrapper[4555]: I1209 14:59:36.168880 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:36 crc kubenswrapper[4555]: I1209 14:59:36.168887 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:36Z","lastTransitionTime":"2025-12-09T14:59:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:36 crc kubenswrapper[4555]: I1209 14:59:36.270986 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:36 crc kubenswrapper[4555]: I1209 14:59:36.271020 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:36 crc kubenswrapper[4555]: I1209 14:59:36.271029 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:36 crc kubenswrapper[4555]: I1209 14:59:36.271042 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:36 crc kubenswrapper[4555]: I1209 14:59:36.271049 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:36Z","lastTransitionTime":"2025-12-09T14:59:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:36 crc kubenswrapper[4555]: I1209 14:59:36.372561 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:36 crc kubenswrapper[4555]: I1209 14:59:36.372603 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:36 crc kubenswrapper[4555]: I1209 14:59:36.372614 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:36 crc kubenswrapper[4555]: I1209 14:59:36.372627 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:36 crc kubenswrapper[4555]: I1209 14:59:36.372637 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:36Z","lastTransitionTime":"2025-12-09T14:59:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:36 crc kubenswrapper[4555]: I1209 14:59:36.474871 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:36 crc kubenswrapper[4555]: I1209 14:59:36.474905 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:36 crc kubenswrapper[4555]: I1209 14:59:36.474915 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:36 crc kubenswrapper[4555]: I1209 14:59:36.474928 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:36 crc kubenswrapper[4555]: I1209 14:59:36.474939 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:36Z","lastTransitionTime":"2025-12-09T14:59:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:36 crc kubenswrapper[4555]: I1209 14:59:36.576890 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:36 crc kubenswrapper[4555]: I1209 14:59:36.576918 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:36 crc kubenswrapper[4555]: I1209 14:59:36.576927 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:36 crc kubenswrapper[4555]: I1209 14:59:36.576936 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:36 crc kubenswrapper[4555]: I1209 14:59:36.576943 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:36Z","lastTransitionTime":"2025-12-09T14:59:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:36 crc kubenswrapper[4555]: I1209 14:59:36.678635 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:36 crc kubenswrapper[4555]: I1209 14:59:36.678665 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:36 crc kubenswrapper[4555]: I1209 14:59:36.678674 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:36 crc kubenswrapper[4555]: I1209 14:59:36.678691 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:36 crc kubenswrapper[4555]: I1209 14:59:36.678699 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:36Z","lastTransitionTime":"2025-12-09T14:59:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:36 crc kubenswrapper[4555]: I1209 14:59:36.780159 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:36 crc kubenswrapper[4555]: I1209 14:59:36.780189 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:36 crc kubenswrapper[4555]: I1209 14:59:36.780199 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:36 crc kubenswrapper[4555]: I1209 14:59:36.780209 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:36 crc kubenswrapper[4555]: I1209 14:59:36.780218 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:36Z","lastTransitionTime":"2025-12-09T14:59:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:36 crc kubenswrapper[4555]: I1209 14:59:36.881925 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:36 crc kubenswrapper[4555]: I1209 14:59:36.881960 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:36 crc kubenswrapper[4555]: I1209 14:59:36.881969 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:36 crc kubenswrapper[4555]: I1209 14:59:36.881985 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:36 crc kubenswrapper[4555]: I1209 14:59:36.881993 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:36Z","lastTransitionTime":"2025-12-09T14:59:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:36 crc kubenswrapper[4555]: I1209 14:59:36.983875 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:36 crc kubenswrapper[4555]: I1209 14:59:36.983913 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:36 crc kubenswrapper[4555]: I1209 14:59:36.983921 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:36 crc kubenswrapper[4555]: I1209 14:59:36.983934 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:36 crc kubenswrapper[4555]: I1209 14:59:36.983943 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:36Z","lastTransitionTime":"2025-12-09T14:59:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:37 crc kubenswrapper[4555]: I1209 14:59:37.085903 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:37 crc kubenswrapper[4555]: I1209 14:59:37.085931 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:37 crc kubenswrapper[4555]: I1209 14:59:37.085939 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:37 crc kubenswrapper[4555]: I1209 14:59:37.085950 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:37 crc kubenswrapper[4555]: I1209 14:59:37.085957 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:37Z","lastTransitionTime":"2025-12-09T14:59:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:37 crc kubenswrapper[4555]: I1209 14:59:37.160382 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:59:37 crc kubenswrapper[4555]: I1209 14:59:37.160412 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:59:37 crc kubenswrapper[4555]: E1209 14:59:37.160454 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 14:59:37 crc kubenswrapper[4555]: I1209 14:59:37.160506 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:59:37 crc kubenswrapper[4555]: E1209 14:59:37.160595 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 14:59:37 crc kubenswrapper[4555]: E1209 14:59:37.160704 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 14:59:37 crc kubenswrapper[4555]: I1209 14:59:37.187849 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:37 crc kubenswrapper[4555]: I1209 14:59:37.187876 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:37 crc kubenswrapper[4555]: I1209 14:59:37.187884 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:37 crc kubenswrapper[4555]: I1209 14:59:37.187893 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:37 crc kubenswrapper[4555]: I1209 14:59:37.187900 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:37Z","lastTransitionTime":"2025-12-09T14:59:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:37 crc kubenswrapper[4555]: I1209 14:59:37.289395 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:37 crc kubenswrapper[4555]: I1209 14:59:37.289421 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:37 crc kubenswrapper[4555]: I1209 14:59:37.289430 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:37 crc kubenswrapper[4555]: I1209 14:59:37.289439 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:37 crc kubenswrapper[4555]: I1209 14:59:37.289445 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:37Z","lastTransitionTime":"2025-12-09T14:59:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:37 crc kubenswrapper[4555]: I1209 14:59:37.390974 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:37 crc kubenswrapper[4555]: I1209 14:59:37.391000 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:37 crc kubenswrapper[4555]: I1209 14:59:37.391008 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:37 crc kubenswrapper[4555]: I1209 14:59:37.391017 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:37 crc kubenswrapper[4555]: I1209 14:59:37.391024 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:37Z","lastTransitionTime":"2025-12-09T14:59:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:37 crc kubenswrapper[4555]: I1209 14:59:37.492533 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:37 crc kubenswrapper[4555]: I1209 14:59:37.492573 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:37 crc kubenswrapper[4555]: I1209 14:59:37.492612 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:37 crc kubenswrapper[4555]: I1209 14:59:37.492627 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:37 crc kubenswrapper[4555]: I1209 14:59:37.492636 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:37Z","lastTransitionTime":"2025-12-09T14:59:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:37 crc kubenswrapper[4555]: I1209 14:59:37.594465 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:37 crc kubenswrapper[4555]: I1209 14:59:37.594496 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:37 crc kubenswrapper[4555]: I1209 14:59:37.594505 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:37 crc kubenswrapper[4555]: I1209 14:59:37.594516 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:37 crc kubenswrapper[4555]: I1209 14:59:37.594524 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:37Z","lastTransitionTime":"2025-12-09T14:59:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:37 crc kubenswrapper[4555]: I1209 14:59:37.696042 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:37 crc kubenswrapper[4555]: I1209 14:59:37.696085 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:37 crc kubenswrapper[4555]: I1209 14:59:37.696094 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:37 crc kubenswrapper[4555]: I1209 14:59:37.696106 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:37 crc kubenswrapper[4555]: I1209 14:59:37.696115 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:37Z","lastTransitionTime":"2025-12-09T14:59:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:37 crc kubenswrapper[4555]: I1209 14:59:37.797833 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:37 crc kubenswrapper[4555]: I1209 14:59:37.797860 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:37 crc kubenswrapper[4555]: I1209 14:59:37.797870 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:37 crc kubenswrapper[4555]: I1209 14:59:37.797880 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:37 crc kubenswrapper[4555]: I1209 14:59:37.797888 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:37Z","lastTransitionTime":"2025-12-09T14:59:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:37 crc kubenswrapper[4555]: I1209 14:59:37.900156 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:37 crc kubenswrapper[4555]: I1209 14:59:37.900190 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:37 crc kubenswrapper[4555]: I1209 14:59:37.900198 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:37 crc kubenswrapper[4555]: I1209 14:59:37.900211 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:37 crc kubenswrapper[4555]: I1209 14:59:37.900220 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:37Z","lastTransitionTime":"2025-12-09T14:59:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:38 crc kubenswrapper[4555]: I1209 14:59:38.002008 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:38 crc kubenswrapper[4555]: I1209 14:59:38.002032 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:38 crc kubenswrapper[4555]: I1209 14:59:38.002040 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:38 crc kubenswrapper[4555]: I1209 14:59:38.002052 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:38 crc kubenswrapper[4555]: I1209 14:59:38.002060 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:38Z","lastTransitionTime":"2025-12-09T14:59:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:38 crc kubenswrapper[4555]: I1209 14:59:38.103665 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:38 crc kubenswrapper[4555]: I1209 14:59:38.103694 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:38 crc kubenswrapper[4555]: I1209 14:59:38.103703 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:38 crc kubenswrapper[4555]: I1209 14:59:38.103713 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:38 crc kubenswrapper[4555]: I1209 14:59:38.103721 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:38Z","lastTransitionTime":"2025-12-09T14:59:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:38 crc kubenswrapper[4555]: I1209 14:59:38.160805 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sc2q" Dec 09 14:59:38 crc kubenswrapper[4555]: E1209 14:59:38.160946 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6sc2q" podUID="35f7e7bd-b667-42fa-b040-4a186e4570e3" Dec 09 14:59:38 crc kubenswrapper[4555]: I1209 14:59:38.205669 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:38 crc kubenswrapper[4555]: I1209 14:59:38.205700 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:38 crc kubenswrapper[4555]: I1209 14:59:38.205709 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:38 crc kubenswrapper[4555]: I1209 14:59:38.205721 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:38 crc kubenswrapper[4555]: I1209 14:59:38.205729 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:38Z","lastTransitionTime":"2025-12-09T14:59:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:38 crc kubenswrapper[4555]: I1209 14:59:38.307638 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:38 crc kubenswrapper[4555]: I1209 14:59:38.307671 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:38 crc kubenswrapper[4555]: I1209 14:59:38.307679 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:38 crc kubenswrapper[4555]: I1209 14:59:38.307688 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:38 crc kubenswrapper[4555]: I1209 14:59:38.307695 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:38Z","lastTransitionTime":"2025-12-09T14:59:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:38 crc kubenswrapper[4555]: I1209 14:59:38.409111 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:38 crc kubenswrapper[4555]: I1209 14:59:38.409146 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:38 crc kubenswrapper[4555]: I1209 14:59:38.409154 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:38 crc kubenswrapper[4555]: I1209 14:59:38.409165 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:38 crc kubenswrapper[4555]: I1209 14:59:38.409173 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:38Z","lastTransitionTime":"2025-12-09T14:59:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:38 crc kubenswrapper[4555]: I1209 14:59:38.511217 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:38 crc kubenswrapper[4555]: I1209 14:59:38.511246 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:38 crc kubenswrapper[4555]: I1209 14:59:38.511254 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:38 crc kubenswrapper[4555]: I1209 14:59:38.511264 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:38 crc kubenswrapper[4555]: I1209 14:59:38.511288 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:38Z","lastTransitionTime":"2025-12-09T14:59:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:38 crc kubenswrapper[4555]: I1209 14:59:38.613392 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:38 crc kubenswrapper[4555]: I1209 14:59:38.613424 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:38 crc kubenswrapper[4555]: I1209 14:59:38.613432 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:38 crc kubenswrapper[4555]: I1209 14:59:38.613444 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:38 crc kubenswrapper[4555]: I1209 14:59:38.613452 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:38Z","lastTransitionTime":"2025-12-09T14:59:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:38 crc kubenswrapper[4555]: I1209 14:59:38.715354 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:38 crc kubenswrapper[4555]: I1209 14:59:38.715390 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:38 crc kubenswrapper[4555]: I1209 14:59:38.715401 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:38 crc kubenswrapper[4555]: I1209 14:59:38.715415 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:38 crc kubenswrapper[4555]: I1209 14:59:38.715425 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:38Z","lastTransitionTime":"2025-12-09T14:59:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:38 crc kubenswrapper[4555]: I1209 14:59:38.816881 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:38 crc kubenswrapper[4555]: I1209 14:59:38.816904 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:38 crc kubenswrapper[4555]: I1209 14:59:38.816912 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:38 crc kubenswrapper[4555]: I1209 14:59:38.816921 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:38 crc kubenswrapper[4555]: I1209 14:59:38.816929 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:38Z","lastTransitionTime":"2025-12-09T14:59:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:38 crc kubenswrapper[4555]: I1209 14:59:38.918531 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:38 crc kubenswrapper[4555]: I1209 14:59:38.918554 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:38 crc kubenswrapper[4555]: I1209 14:59:38.918565 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:38 crc kubenswrapper[4555]: I1209 14:59:38.918574 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:38 crc kubenswrapper[4555]: I1209 14:59:38.918581 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:38Z","lastTransitionTime":"2025-12-09T14:59:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:39 crc kubenswrapper[4555]: I1209 14:59:39.020231 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:39 crc kubenswrapper[4555]: I1209 14:59:39.020259 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:39 crc kubenswrapper[4555]: I1209 14:59:39.020267 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:39 crc kubenswrapper[4555]: I1209 14:59:39.020277 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:39 crc kubenswrapper[4555]: I1209 14:59:39.020284 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:39Z","lastTransitionTime":"2025-12-09T14:59:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:39 crc kubenswrapper[4555]: I1209 14:59:39.121688 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:39 crc kubenswrapper[4555]: I1209 14:59:39.121710 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:39 crc kubenswrapper[4555]: I1209 14:59:39.121718 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:39 crc kubenswrapper[4555]: I1209 14:59:39.121727 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:39 crc kubenswrapper[4555]: I1209 14:59:39.121733 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:39Z","lastTransitionTime":"2025-12-09T14:59:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:39 crc kubenswrapper[4555]: I1209 14:59:39.160983 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:59:39 crc kubenswrapper[4555]: I1209 14:59:39.161047 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:59:39 crc kubenswrapper[4555]: I1209 14:59:39.161115 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:59:39 crc kubenswrapper[4555]: E1209 14:59:39.161054 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 14:59:39 crc kubenswrapper[4555]: E1209 14:59:39.161147 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 14:59:39 crc kubenswrapper[4555]: E1209 14:59:39.161238 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 14:59:39 crc kubenswrapper[4555]: I1209 14:59:39.223430 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:39 crc kubenswrapper[4555]: I1209 14:59:39.223457 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:39 crc kubenswrapper[4555]: I1209 14:59:39.223465 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:39 crc kubenswrapper[4555]: I1209 14:59:39.223475 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:39 crc kubenswrapper[4555]: I1209 14:59:39.223483 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:39Z","lastTransitionTime":"2025-12-09T14:59:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:39 crc kubenswrapper[4555]: I1209 14:59:39.325035 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:39 crc kubenswrapper[4555]: I1209 14:59:39.325056 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:39 crc kubenswrapper[4555]: I1209 14:59:39.325064 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:39 crc kubenswrapper[4555]: I1209 14:59:39.325072 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:39 crc kubenswrapper[4555]: I1209 14:59:39.325079 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:39Z","lastTransitionTime":"2025-12-09T14:59:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:39 crc kubenswrapper[4555]: I1209 14:59:39.426280 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:39 crc kubenswrapper[4555]: I1209 14:59:39.426296 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:39 crc kubenswrapper[4555]: I1209 14:59:39.426321 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:39 crc kubenswrapper[4555]: I1209 14:59:39.426330 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:39 crc kubenswrapper[4555]: I1209 14:59:39.426337 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:39Z","lastTransitionTime":"2025-12-09T14:59:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:39 crc kubenswrapper[4555]: I1209 14:59:39.528061 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:39 crc kubenswrapper[4555]: I1209 14:59:39.528093 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:39 crc kubenswrapper[4555]: I1209 14:59:39.528102 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:39 crc kubenswrapper[4555]: I1209 14:59:39.528115 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:39 crc kubenswrapper[4555]: I1209 14:59:39.528124 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:39Z","lastTransitionTime":"2025-12-09T14:59:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:39 crc kubenswrapper[4555]: I1209 14:59:39.629681 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:39 crc kubenswrapper[4555]: I1209 14:59:39.629717 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:39 crc kubenswrapper[4555]: I1209 14:59:39.629726 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:39 crc kubenswrapper[4555]: I1209 14:59:39.629739 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:39 crc kubenswrapper[4555]: I1209 14:59:39.629757 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:39Z","lastTransitionTime":"2025-12-09T14:59:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:39 crc kubenswrapper[4555]: I1209 14:59:39.731331 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:39 crc kubenswrapper[4555]: I1209 14:59:39.731361 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:39 crc kubenswrapper[4555]: I1209 14:59:39.731369 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:39 crc kubenswrapper[4555]: I1209 14:59:39.731379 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:39 crc kubenswrapper[4555]: I1209 14:59:39.731388 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:39Z","lastTransitionTime":"2025-12-09T14:59:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:39 crc kubenswrapper[4555]: I1209 14:59:39.833003 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:39 crc kubenswrapper[4555]: I1209 14:59:39.833031 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:39 crc kubenswrapper[4555]: I1209 14:59:39.833039 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:39 crc kubenswrapper[4555]: I1209 14:59:39.833049 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:39 crc kubenswrapper[4555]: I1209 14:59:39.833057 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:39Z","lastTransitionTime":"2025-12-09T14:59:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:39 crc kubenswrapper[4555]: I1209 14:59:39.934885 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:39 crc kubenswrapper[4555]: I1209 14:59:39.934915 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:39 crc kubenswrapper[4555]: I1209 14:59:39.934924 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:39 crc kubenswrapper[4555]: I1209 14:59:39.934936 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:39 crc kubenswrapper[4555]: I1209 14:59:39.934944 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:39Z","lastTransitionTime":"2025-12-09T14:59:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:40 crc kubenswrapper[4555]: I1209 14:59:40.037024 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:40 crc kubenswrapper[4555]: I1209 14:59:40.037061 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:40 crc kubenswrapper[4555]: I1209 14:59:40.037069 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:40 crc kubenswrapper[4555]: I1209 14:59:40.037083 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:40 crc kubenswrapper[4555]: I1209 14:59:40.037096 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:40Z","lastTransitionTime":"2025-12-09T14:59:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:40 crc kubenswrapper[4555]: I1209 14:59:40.138568 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:40 crc kubenswrapper[4555]: I1209 14:59:40.138600 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:40 crc kubenswrapper[4555]: I1209 14:59:40.138609 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:40 crc kubenswrapper[4555]: I1209 14:59:40.138621 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:40 crc kubenswrapper[4555]: I1209 14:59:40.138629 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:40Z","lastTransitionTime":"2025-12-09T14:59:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:40 crc kubenswrapper[4555]: I1209 14:59:40.160153 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sc2q" Dec 09 14:59:40 crc kubenswrapper[4555]: E1209 14:59:40.160250 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6sc2q" podUID="35f7e7bd-b667-42fa-b040-4a186e4570e3" Dec 09 14:59:40 crc kubenswrapper[4555]: I1209 14:59:40.162236 4555 scope.go:117] "RemoveContainer" containerID="f2496bb1d7c5b48f9944b2026978276ba9cf676ecd42dff619698a1ab8776269" Dec 09 14:59:40 crc kubenswrapper[4555]: E1209 14:59:40.163286 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-p4bl4_openshift-ovn-kubernetes(c87d128c-39da-4880-9efc-7d2e6319a854)\"" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" podUID="c87d128c-39da-4880-9efc-7d2e6319a854" Dec 09 14:59:40 crc kubenswrapper[4555]: I1209 14:59:40.181108 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-tpsq2" podStartSLOduration=68.181097454 podStartE2EDuration="1m8.181097454s" podCreationTimestamp="2025-12-09 14:58:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 14:59:40.181046879 +0000 UTC m=+90.317759597" watchObservedRunningTime="2025-12-09 14:59:40.181097454 +0000 UTC m=+90.317810172" Dec 09 14:59:40 crc kubenswrapper[4555]: I1209 14:59:40.198577 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qchk6" podStartSLOduration=68.198560728 podStartE2EDuration="1m8.198560728s" podCreationTimestamp="2025-12-09 14:58:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 14:59:40.192151044 +0000 UTC m=+90.328863762" watchObservedRunningTime="2025-12-09 14:59:40.198560728 +0000 UTC m=+90.335273445" Dec 09 14:59:40 crc kubenswrapper[4555]: I1209 14:59:40.205701 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=18.20569564 podStartE2EDuration="18.20569564s" podCreationTimestamp="2025-12-09 14:59:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 14:59:40.198655887 +0000 UTC m=+90.335368605" watchObservedRunningTime="2025-12-09 14:59:40.20569564 +0000 UTC m=+90.342408349" Dec 09 14:59:40 crc kubenswrapper[4555]: I1209 14:59:40.224076 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" podStartSLOduration=68.224066475 podStartE2EDuration="1m8.224066475s" podCreationTimestamp="2025-12-09 14:58:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 14:59:40.224012243 +0000 UTC m=+90.360724962" watchObservedRunningTime="2025-12-09 14:59:40.224066475 +0000 UTC m=+90.360779193" Dec 09 14:59:40 crc kubenswrapper[4555]: I1209 14:59:40.230153 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-t6kl9" podStartSLOduration=68.230141598 podStartE2EDuration="1m8.230141598s" podCreationTimestamp="2025-12-09 14:58:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 14:59:40.230056178 +0000 UTC 
m=+90.366768896" watchObservedRunningTime="2025-12-09 14:59:40.230141598 +0000 UTC m=+90.366854327" Dec 09 14:59:40 crc kubenswrapper[4555]: I1209 14:59:40.240068 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:40 crc kubenswrapper[4555]: I1209 14:59:40.240097 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:40 crc kubenswrapper[4555]: I1209 14:59:40.240106 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:40 crc kubenswrapper[4555]: I1209 14:59:40.240117 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:40 crc kubenswrapper[4555]: I1209 14:59:40.240125 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:40Z","lastTransitionTime":"2025-12-09T14:59:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:40 crc kubenswrapper[4555]: I1209 14:59:40.249539 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=69.249526786 podStartE2EDuration="1m9.249526786s" podCreationTimestamp="2025-12-09 14:58:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 14:59:40.249240798 +0000 UTC m=+90.385953516" watchObservedRunningTime="2025-12-09 14:59:40.249526786 +0000 UTC m=+90.386239505" Dec 09 14:59:40 crc kubenswrapper[4555]: I1209 14:59:40.249756 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=73.249751601 podStartE2EDuration="1m13.249751601s" podCreationTimestamp="2025-12-09 14:58:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 14:59:40.24032563 +0000 UTC m=+90.377038347" watchObservedRunningTime="2025-12-09 14:59:40.249751601 +0000 UTC m=+90.386464318" Dec 09 14:59:40 crc kubenswrapper[4555]: I1209 14:59:40.279106 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-gjjcm" podStartSLOduration=68.279090985 podStartE2EDuration="1m8.279090985s" podCreationTimestamp="2025-12-09 14:58:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 14:59:40.279079273 +0000 UTC m=+90.415791992" watchObservedRunningTime="2025-12-09 14:59:40.279090985 +0000 UTC m=+90.415803703" Dec 09 14:59:40 crc kubenswrapper[4555]: I1209 14:59:40.306400 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-72msc" podStartSLOduration=68.306386276 podStartE2EDuration="1m8.306386276s" podCreationTimestamp="2025-12-09 14:58:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 14:59:40.305786897 +0000 UTC m=+90.442499615" watchObservedRunningTime="2025-12-09 14:59:40.306386276 
+0000 UTC m=+90.443098995" Dec 09 14:59:40 crc kubenswrapper[4555]: I1209 14:59:40.332639 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=71.332622561 podStartE2EDuration="1m11.332622561s" podCreationTimestamp="2025-12-09 14:58:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 14:59:40.331423741 +0000 UTC m=+90.468136458" watchObservedRunningTime="2025-12-09 14:59:40.332622561 +0000 UTC m=+90.469335278" Dec 09 14:59:40 crc kubenswrapper[4555]: I1209 14:59:40.342131 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:40 crc kubenswrapper[4555]: I1209 14:59:40.342165 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:40 crc kubenswrapper[4555]: I1209 14:59:40.342191 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:40 crc kubenswrapper[4555]: I1209 14:59:40.342205 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:40 crc kubenswrapper[4555]: I1209 14:59:40.342213 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:40Z","lastTransitionTime":"2025-12-09T14:59:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:40 crc kubenswrapper[4555]: I1209 14:59:40.342341 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=41.342329121 podStartE2EDuration="41.342329121s" podCreationTimestamp="2025-12-09 14:58:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 14:59:40.341475572 +0000 UTC m=+90.478188290" watchObservedRunningTime="2025-12-09 14:59:40.342329121 +0000 UTC m=+90.479041839" Dec 09 14:59:40 crc kubenswrapper[4555]: I1209 14:59:40.444050 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:40 crc kubenswrapper[4555]: I1209 14:59:40.444099 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:40 crc kubenswrapper[4555]: I1209 14:59:40.444109 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:40 crc kubenswrapper[4555]: I1209 14:59:40.444120 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:40 crc kubenswrapper[4555]: I1209 14:59:40.444129 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:40Z","lastTransitionTime":"2025-12-09T14:59:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:40 crc kubenswrapper[4555]: I1209 14:59:40.546337 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:40 crc kubenswrapper[4555]: I1209 14:59:40.546365 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:40 crc kubenswrapper[4555]: I1209 14:59:40.546376 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:40 crc kubenswrapper[4555]: I1209 14:59:40.546406 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:40 crc kubenswrapper[4555]: I1209 14:59:40.546417 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:40Z","lastTransitionTime":"2025-12-09T14:59:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:40 crc kubenswrapper[4555]: I1209 14:59:40.648656 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:40 crc kubenswrapper[4555]: I1209 14:59:40.648698 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:40 crc kubenswrapper[4555]: I1209 14:59:40.648708 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:40 crc kubenswrapper[4555]: I1209 14:59:40.648724 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:40 crc kubenswrapper[4555]: I1209 14:59:40.648742 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:40Z","lastTransitionTime":"2025-12-09T14:59:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:40 crc kubenswrapper[4555]: I1209 14:59:40.750833 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:40 crc kubenswrapper[4555]: I1209 14:59:40.750864 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:40 crc kubenswrapper[4555]: I1209 14:59:40.750873 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:40 crc kubenswrapper[4555]: I1209 14:59:40.750883 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:40 crc kubenswrapper[4555]: I1209 14:59:40.750891 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:40Z","lastTransitionTime":"2025-12-09T14:59:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:40 crc kubenswrapper[4555]: I1209 14:59:40.852574 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:40 crc kubenswrapper[4555]: I1209 14:59:40.852607 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:40 crc kubenswrapper[4555]: I1209 14:59:40.852614 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:40 crc kubenswrapper[4555]: I1209 14:59:40.852626 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:40 crc kubenswrapper[4555]: I1209 14:59:40.852636 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:40Z","lastTransitionTime":"2025-12-09T14:59:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:40 crc kubenswrapper[4555]: I1209 14:59:40.953881 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:40 crc kubenswrapper[4555]: I1209 14:59:40.953911 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:40 crc kubenswrapper[4555]: I1209 14:59:40.953919 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:40 crc kubenswrapper[4555]: I1209 14:59:40.953930 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:40 crc kubenswrapper[4555]: I1209 14:59:40.953937 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:40Z","lastTransitionTime":"2025-12-09T14:59:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:41 crc kubenswrapper[4555]: I1209 14:59:41.055537 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:41 crc kubenswrapper[4555]: I1209 14:59:41.055566 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:41 crc kubenswrapper[4555]: I1209 14:59:41.055574 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:41 crc kubenswrapper[4555]: I1209 14:59:41.055590 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:41 crc kubenswrapper[4555]: I1209 14:59:41.055597 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:41Z","lastTransitionTime":"2025-12-09T14:59:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:41 crc kubenswrapper[4555]: I1209 14:59:41.157283 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:41 crc kubenswrapper[4555]: I1209 14:59:41.157328 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:41 crc kubenswrapper[4555]: I1209 14:59:41.157337 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:41 crc kubenswrapper[4555]: I1209 14:59:41.157348 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:41 crc kubenswrapper[4555]: I1209 14:59:41.157356 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:41Z","lastTransitionTime":"2025-12-09T14:59:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:41 crc kubenswrapper[4555]: I1209 14:59:41.160664 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:59:41 crc kubenswrapper[4555]: I1209 14:59:41.160691 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:59:41 crc kubenswrapper[4555]: I1209 14:59:41.160693 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:59:41 crc kubenswrapper[4555]: E1209 14:59:41.160762 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 14:59:41 crc kubenswrapper[4555]: E1209 14:59:41.160850 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 14:59:41 crc kubenswrapper[4555]: E1209 14:59:41.160886 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 14:59:41 crc kubenswrapper[4555]: I1209 14:59:41.259378 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:41 crc kubenswrapper[4555]: I1209 14:59:41.259412 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:41 crc kubenswrapper[4555]: I1209 14:59:41.259421 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:41 crc kubenswrapper[4555]: I1209 14:59:41.259434 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:41 crc kubenswrapper[4555]: I1209 14:59:41.259442 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:41Z","lastTransitionTime":"2025-12-09T14:59:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:41 crc kubenswrapper[4555]: I1209 14:59:41.361504 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:41 crc kubenswrapper[4555]: I1209 14:59:41.361534 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:41 crc kubenswrapper[4555]: I1209 14:59:41.361543 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:41 crc kubenswrapper[4555]: I1209 14:59:41.361553 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:41 crc kubenswrapper[4555]: I1209 14:59:41.361562 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:41Z","lastTransitionTime":"2025-12-09T14:59:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:41 crc kubenswrapper[4555]: I1209 14:59:41.463554 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:41 crc kubenswrapper[4555]: I1209 14:59:41.463580 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:41 crc kubenswrapper[4555]: I1209 14:59:41.463589 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:41 crc kubenswrapper[4555]: I1209 14:59:41.463600 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:41 crc kubenswrapper[4555]: I1209 14:59:41.463608 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:41Z","lastTransitionTime":"2025-12-09T14:59:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:41 crc kubenswrapper[4555]: I1209 14:59:41.565022 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:41 crc kubenswrapper[4555]: I1209 14:59:41.565050 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:41 crc kubenswrapper[4555]: I1209 14:59:41.565059 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:41 crc kubenswrapper[4555]: I1209 14:59:41.565091 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:41 crc kubenswrapper[4555]: I1209 14:59:41.565101 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:41Z","lastTransitionTime":"2025-12-09T14:59:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:41 crc kubenswrapper[4555]: I1209 14:59:41.666692 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:41 crc kubenswrapper[4555]: I1209 14:59:41.666725 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:41 crc kubenswrapper[4555]: I1209 14:59:41.666743 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:41 crc kubenswrapper[4555]: I1209 14:59:41.666752 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:41 crc kubenswrapper[4555]: I1209 14:59:41.666759 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:41Z","lastTransitionTime":"2025-12-09T14:59:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:41 crc kubenswrapper[4555]: I1209 14:59:41.768636 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:41 crc kubenswrapper[4555]: I1209 14:59:41.768672 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:41 crc kubenswrapper[4555]: I1209 14:59:41.768682 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:41 crc kubenswrapper[4555]: I1209 14:59:41.768700 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:41 crc kubenswrapper[4555]: I1209 14:59:41.768709 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:41Z","lastTransitionTime":"2025-12-09T14:59:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:41 crc kubenswrapper[4555]: I1209 14:59:41.837708 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:41 crc kubenswrapper[4555]: I1209 14:59:41.837743 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:41 crc kubenswrapper[4555]: I1209 14:59:41.837751 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:41 crc kubenswrapper[4555]: I1209 14:59:41.837761 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:41 crc kubenswrapper[4555]: I1209 14:59:41.837768 4555 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:41Z","lastTransitionTime":"2025-12-09T14:59:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:41 crc kubenswrapper[4555]: I1209 14:59:41.863098 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk4s8"] Dec 09 14:59:41 crc kubenswrapper[4555]: I1209 14:59:41.863423 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk4s8" Dec 09 14:59:41 crc kubenswrapper[4555]: I1209 14:59:41.865414 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 09 14:59:41 crc kubenswrapper[4555]: I1209 14:59:41.865507 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 09 14:59:41 crc kubenswrapper[4555]: I1209 14:59:41.865770 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 09 14:59:41 crc kubenswrapper[4555]: I1209 14:59:41.865912 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 09 14:59:41 crc kubenswrapper[4555]: I1209 14:59:41.961440 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/9513e5a2-58fa-4322-b079-4d6af4ee0d45-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-lk4s8\" (UID: \"9513e5a2-58fa-4322-b079-4d6af4ee0d45\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk4s8" Dec 09 14:59:41 crc kubenswrapper[4555]: I1209 14:59:41.961474 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/9513e5a2-58fa-4322-b079-4d6af4ee0d45-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-lk4s8\" (UID: \"9513e5a2-58fa-4322-b079-4d6af4ee0d45\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk4s8" Dec 09 14:59:41 crc kubenswrapper[4555]: I1209 14:59:41.961492 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9513e5a2-58fa-4322-b079-4d6af4ee0d45-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-lk4s8\" (UID: \"9513e5a2-58fa-4322-b079-4d6af4ee0d45\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk4s8" Dec 09 14:59:41 crc kubenswrapper[4555]: I1209 14:59:41.961517 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9513e5a2-58fa-4322-b079-4d6af4ee0d45-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-lk4s8\" (UID: \"9513e5a2-58fa-4322-b079-4d6af4ee0d45\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk4s8" Dec 09 14:59:41 crc kubenswrapper[4555]: I1209 14:59:41.961539 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9513e5a2-58fa-4322-b079-4d6af4ee0d45-service-ca\") pod \"cluster-version-operator-5c965bbfc6-lk4s8\" (UID: \"9513e5a2-58fa-4322-b079-4d6af4ee0d45\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk4s8" Dec 09 14:59:42 crc kubenswrapper[4555]: I1209 14:59:42.062492 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9513e5a2-58fa-4322-b079-4d6af4ee0d45-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-lk4s8\" (UID: \"9513e5a2-58fa-4322-b079-4d6af4ee0d45\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk4s8" Dec 09 14:59:42 crc 
kubenswrapper[4555]: I1209 14:59:42.062532 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9513e5a2-58fa-4322-b079-4d6af4ee0d45-service-ca\") pod \"cluster-version-operator-5c965bbfc6-lk4s8\" (UID: \"9513e5a2-58fa-4322-b079-4d6af4ee0d45\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk4s8" Dec 09 14:59:42 crc kubenswrapper[4555]: I1209 14:59:42.062580 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/9513e5a2-58fa-4322-b079-4d6af4ee0d45-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-lk4s8\" (UID: \"9513e5a2-58fa-4322-b079-4d6af4ee0d45\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk4s8" Dec 09 14:59:42 crc kubenswrapper[4555]: I1209 14:59:42.062601 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/9513e5a2-58fa-4322-b079-4d6af4ee0d45-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-lk4s8\" (UID: \"9513e5a2-58fa-4322-b079-4d6af4ee0d45\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk4s8" Dec 09 14:59:42 crc kubenswrapper[4555]: I1209 14:59:42.062615 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9513e5a2-58fa-4322-b079-4d6af4ee0d45-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-lk4s8\" (UID: \"9513e5a2-58fa-4322-b079-4d6af4ee0d45\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk4s8" Dec 09 14:59:42 crc kubenswrapper[4555]: I1209 14:59:42.062665 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/9513e5a2-58fa-4322-b079-4d6af4ee0d45-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-lk4s8\" (UID: \"9513e5a2-58fa-4322-b079-4d6af4ee0d45\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk4s8" Dec 09 14:59:42 crc kubenswrapper[4555]: I1209 14:59:42.062665 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/9513e5a2-58fa-4322-b079-4d6af4ee0d45-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-lk4s8\" (UID: \"9513e5a2-58fa-4322-b079-4d6af4ee0d45\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk4s8" Dec 09 14:59:42 crc kubenswrapper[4555]: I1209 14:59:42.063367 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9513e5a2-58fa-4322-b079-4d6af4ee0d45-service-ca\") pod \"cluster-version-operator-5c965bbfc6-lk4s8\" (UID: \"9513e5a2-58fa-4322-b079-4d6af4ee0d45\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk4s8" Dec 09 14:59:42 crc kubenswrapper[4555]: I1209 14:59:42.066848 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9513e5a2-58fa-4322-b079-4d6af4ee0d45-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-lk4s8\" (UID: \"9513e5a2-58fa-4322-b079-4d6af4ee0d45\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk4s8" Dec 09 14:59:42 crc kubenswrapper[4555]: I1209 14:59:42.075815 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9513e5a2-58fa-4322-b079-4d6af4ee0d45-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-lk4s8\" (UID: \"9513e5a2-58fa-4322-b079-4d6af4ee0d45\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk4s8" Dec 09 14:59:42 crc kubenswrapper[4555]: I1209 14:59:42.160299 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sc2q" Dec 09 14:59:42 crc kubenswrapper[4555]: E1209 14:59:42.160418 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6sc2q" podUID="35f7e7bd-b667-42fa-b040-4a186e4570e3" Dec 09 14:59:42 crc kubenswrapper[4555]: I1209 14:59:42.174080 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk4s8" Dec 09 14:59:42 crc kubenswrapper[4555]: W1209 14:59:42.184109 4555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9513e5a2_58fa_4322_b079_4d6af4ee0d45.slice/crio-250878d080fc656c89f51b954771b855e1a463727faeecbbbad09ebcb25e0990 WatchSource:0}: Error finding container 250878d080fc656c89f51b954771b855e1a463727faeecbbbad09ebcb25e0990: Status 404 returned error can't find the container with id 250878d080fc656c89f51b954771b855e1a463727faeecbbbad09ebcb25e0990 Dec 09 14:59:42 crc kubenswrapper[4555]: I1209 14:59:42.456444 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk4s8" event={"ID":"9513e5a2-58fa-4322-b079-4d6af4ee0d45","Type":"ContainerStarted","Data":"6cbffd8ee5f7ab9a525976936124102ee77e5c67dadc226380410f709eaadca9"} Dec 09 14:59:42 crc kubenswrapper[4555]: I1209 14:59:42.456486 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk4s8" event={"ID":"9513e5a2-58fa-4322-b079-4d6af4ee0d45","Type":"ContainerStarted","Data":"250878d080fc656c89f51b954771b855e1a463727faeecbbbad09ebcb25e0990"} Dec 09 14:59:42 crc kubenswrapper[4555]: I1209 14:59:42.465279 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk4s8" podStartSLOduration=70.465261202 podStartE2EDuration="1m10.465261202s" podCreationTimestamp="2025-12-09 14:58:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 14:59:42.465090629 +0000 UTC m=+92.601803347" watchObservedRunningTime="2025-12-09 14:59:42.465261202 +0000 UTC m=+92.601973930" Dec 09 14:59:43 crc kubenswrapper[4555]: I1209 14:59:43.160700 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:59:43 crc kubenswrapper[4555]: I1209 14:59:43.160763 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:59:43 crc kubenswrapper[4555]: E1209 14:59:43.160802 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 14:59:43 crc kubenswrapper[4555]: I1209 14:59:43.160811 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:59:43 crc kubenswrapper[4555]: E1209 14:59:43.160970 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 14:59:43 crc kubenswrapper[4555]: E1209 14:59:43.161000 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 14:59:44 crc kubenswrapper[4555]: I1209 14:59:44.160504 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sc2q" Dec 09 14:59:44 crc kubenswrapper[4555]: E1209 14:59:44.160627 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6sc2q" podUID="35f7e7bd-b667-42fa-b040-4a186e4570e3" Dec 09 14:59:45 crc kubenswrapper[4555]: I1209 14:59:45.160261 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:59:45 crc kubenswrapper[4555]: I1209 14:59:45.160291 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:59:45 crc kubenswrapper[4555]: E1209 14:59:45.160364 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 14:59:45 crc kubenswrapper[4555]: I1209 14:59:45.160384 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:59:45 crc kubenswrapper[4555]: E1209 14:59:45.160579 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 14:59:45 crc kubenswrapper[4555]: E1209 14:59:45.160691 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 14:59:46 crc kubenswrapper[4555]: I1209 14:59:46.160695 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sc2q" Dec 09 14:59:46 crc kubenswrapper[4555]: E1209 14:59:46.160848 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6sc2q" podUID="35f7e7bd-b667-42fa-b040-4a186e4570e3" Dec 09 14:59:47 crc kubenswrapper[4555]: I1209 14:59:47.160664 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:59:47 crc kubenswrapper[4555]: I1209 14:59:47.160728 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:59:47 crc kubenswrapper[4555]: I1209 14:59:47.160800 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:59:47 crc kubenswrapper[4555]: E1209 14:59:47.160900 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 14:59:47 crc kubenswrapper[4555]: E1209 14:59:47.161086 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 14:59:47 crc kubenswrapper[4555]: E1209 14:59:47.161157 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 14:59:48 crc kubenswrapper[4555]: I1209 14:59:48.160799 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sc2q" Dec 09 14:59:48 crc kubenswrapper[4555]: E1209 14:59:48.160883 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6sc2q" podUID="35f7e7bd-b667-42fa-b040-4a186e4570e3" Dec 09 14:59:49 crc kubenswrapper[4555]: I1209 14:59:49.160821 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:59:49 crc kubenswrapper[4555]: I1209 14:59:49.160873 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:59:49 crc kubenswrapper[4555]: E1209 14:59:49.160912 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 14:59:49 crc kubenswrapper[4555]: I1209 14:59:49.160836 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:59:49 crc kubenswrapper[4555]: E1209 14:59:49.161006 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 14:59:49 crc kubenswrapper[4555]: E1209 14:59:49.161115 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 14:59:50 crc kubenswrapper[4555]: I1209 14:59:50.160461 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sc2q" Dec 09 14:59:50 crc kubenswrapper[4555]: E1209 14:59:50.161072 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6sc2q" podUID="35f7e7bd-b667-42fa-b040-4a186e4570e3" Dec 09 14:59:50 crc kubenswrapper[4555]: I1209 14:59:50.330367 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/35f7e7bd-b667-42fa-b040-4a186e4570e3-metrics-certs\") pod \"network-metrics-daemon-6sc2q\" (UID: \"35f7e7bd-b667-42fa-b040-4a186e4570e3\") " pod="openshift-multus/network-metrics-daemon-6sc2q" Dec 09 14:59:50 crc kubenswrapper[4555]: E1209 14:59:50.330508 4555 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 14:59:50 crc kubenswrapper[4555]: E1209 14:59:50.330560 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/35f7e7bd-b667-42fa-b040-4a186e4570e3-metrics-certs podName:35f7e7bd-b667-42fa-b040-4a186e4570e3 nodeName:}" failed. No retries permitted until 2025-12-09 15:00:54.33054763 +0000 UTC m=+164.467260348 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/35f7e7bd-b667-42fa-b040-4a186e4570e3-metrics-certs") pod "network-metrics-daemon-6sc2q" (UID: "35f7e7bd-b667-42fa-b040-4a186e4570e3") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 14:59:51 crc kubenswrapper[4555]: I1209 14:59:51.160991 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:59:51 crc kubenswrapper[4555]: I1209 14:59:51.161056 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:59:51 crc kubenswrapper[4555]: I1209 14:59:51.161095 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:59:51 crc kubenswrapper[4555]: E1209 14:59:51.161212 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 14:59:51 crc kubenswrapper[4555]: E1209 14:59:51.161272 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 14:59:51 crc kubenswrapper[4555]: E1209 14:59:51.161533 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 14:59:51 crc kubenswrapper[4555]: I1209 14:59:51.161695 4555 scope.go:117] "RemoveContainer" containerID="f2496bb1d7c5b48f9944b2026978276ba9cf676ecd42dff619698a1ab8776269" Dec 09 14:59:51 crc kubenswrapper[4555]: E1209 14:59:51.161810 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-p4bl4_openshift-ovn-kubernetes(c87d128c-39da-4880-9efc-7d2e6319a854)\"" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" podUID="c87d128c-39da-4880-9efc-7d2e6319a854" Dec 09 14:59:52 crc kubenswrapper[4555]: I1209 14:59:52.161181 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sc2q" Dec 09 14:59:52 crc kubenswrapper[4555]: E1209 14:59:52.161351 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6sc2q" podUID="35f7e7bd-b667-42fa-b040-4a186e4570e3" Dec 09 14:59:53 crc kubenswrapper[4555]: I1209 14:59:53.160594 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:59:53 crc kubenswrapper[4555]: I1209 14:59:53.160695 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:59:53 crc kubenswrapper[4555]: E1209 14:59:53.160812 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 14:59:53 crc kubenswrapper[4555]: I1209 14:59:53.160908 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:59:53 crc kubenswrapper[4555]: E1209 14:59:53.160909 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 14:59:53 crc kubenswrapper[4555]: E1209 14:59:53.161021 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 14:59:54 crc kubenswrapper[4555]: I1209 14:59:54.160945 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sc2q" Dec 09 14:59:54 crc kubenswrapper[4555]: E1209 14:59:54.161083 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6sc2q" podUID="35f7e7bd-b667-42fa-b040-4a186e4570e3" Dec 09 14:59:55 crc kubenswrapper[4555]: I1209 14:59:55.160259 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:59:55 crc kubenswrapper[4555]: E1209 14:59:55.160528 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 14:59:55 crc kubenswrapper[4555]: I1209 14:59:55.160426 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:59:55 crc kubenswrapper[4555]: I1209 14:59:55.160384 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:59:55 crc kubenswrapper[4555]: E1209 14:59:55.160585 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 14:59:55 crc kubenswrapper[4555]: E1209 14:59:55.160701 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 14:59:56 crc kubenswrapper[4555]: I1209 14:59:56.160154 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sc2q" Dec 09 14:59:56 crc kubenswrapper[4555]: E1209 14:59:56.160287 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6sc2q" podUID="35f7e7bd-b667-42fa-b040-4a186e4570e3" Dec 09 14:59:57 crc kubenswrapper[4555]: I1209 14:59:57.161054 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:59:57 crc kubenswrapper[4555]: I1209 14:59:57.161150 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:59:57 crc kubenswrapper[4555]: E1209 14:59:57.161239 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 14:59:57 crc kubenswrapper[4555]: I1209 14:59:57.161394 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:59:57 crc kubenswrapper[4555]: E1209 14:59:57.161584 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 14:59:57 crc kubenswrapper[4555]: E1209 14:59:57.161735 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 14:59:58 crc kubenswrapper[4555]: I1209 14:59:58.160715 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sc2q" Dec 09 14:59:58 crc kubenswrapper[4555]: E1209 14:59:58.160815 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6sc2q" podUID="35f7e7bd-b667-42fa-b040-4a186e4570e3" Dec 09 14:59:59 crc kubenswrapper[4555]: I1209 14:59:59.160817 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:59:59 crc kubenswrapper[4555]: I1209 14:59:59.160838 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:59:59 crc kubenswrapper[4555]: I1209 14:59:59.161007 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:59:59 crc kubenswrapper[4555]: E1209 14:59:59.161587 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 14:59:59 crc kubenswrapper[4555]: E1209 14:59:59.161646 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 14:59:59 crc kubenswrapper[4555]: E1209 14:59:59.161737 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 15:00:00 crc kubenswrapper[4555]: I1209 15:00:00.160584 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sc2q" Dec 09 15:00:00 crc kubenswrapper[4555]: E1209 15:00:00.161404 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6sc2q" podUID="35f7e7bd-b667-42fa-b040-4a186e4570e3" Dec 09 15:00:01 crc kubenswrapper[4555]: I1209 15:00:01.160998 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 15:00:01 crc kubenswrapper[4555]: E1209 15:00:01.161092 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 15:00:01 crc kubenswrapper[4555]: I1209 15:00:01.161113 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 15:00:01 crc kubenswrapper[4555]: E1209 15:00:01.161199 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 15:00:01 crc kubenswrapper[4555]: I1209 15:00:01.161333 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 15:00:01 crc kubenswrapper[4555]: E1209 15:00:01.161612 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 15:00:02 crc kubenswrapper[4555]: I1209 15:00:02.160905 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sc2q" Dec 09 15:00:02 crc kubenswrapper[4555]: E1209 15:00:02.161007 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6sc2q" podUID="35f7e7bd-b667-42fa-b040-4a186e4570e3" Dec 09 15:00:03 crc kubenswrapper[4555]: I1209 15:00:03.160740 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 15:00:03 crc kubenswrapper[4555]: E1209 15:00:03.160832 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 15:00:03 crc kubenswrapper[4555]: I1209 15:00:03.160753 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 15:00:03 crc kubenswrapper[4555]: E1209 15:00:03.160902 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 15:00:03 crc kubenswrapper[4555]: I1209 15:00:03.160736 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 15:00:03 crc kubenswrapper[4555]: E1209 15:00:03.160954 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 15:00:04 crc kubenswrapper[4555]: I1209 15:00:04.160595 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sc2q" Dec 09 15:00:04 crc kubenswrapper[4555]: E1209 15:00:04.161024 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6sc2q" podUID="35f7e7bd-b667-42fa-b040-4a186e4570e3" Dec 09 15:00:04 crc kubenswrapper[4555]: I1209 15:00:04.161189 4555 scope.go:117] "RemoveContainer" containerID="f2496bb1d7c5b48f9944b2026978276ba9cf676ecd42dff619698a1ab8776269" Dec 09 15:00:04 crc kubenswrapper[4555]: I1209 15:00:04.501979 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p4bl4_c87d128c-39da-4880-9efc-7d2e6319a854/ovnkube-controller/3.log" Dec 09 15:00:04 crc kubenswrapper[4555]: I1209 15:00:04.504407 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" event={"ID":"c87d128c-39da-4880-9efc-7d2e6319a854","Type":"ContainerStarted","Data":"57d2adc84a2cd1f31fcedce27250e4e7255a9cc27f0d8422591d743dfb996466"} Dec 09 15:00:04 crc kubenswrapper[4555]: I1209 15:00:04.504756 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" Dec 09 15:00:04 crc kubenswrapper[4555]: I1209 15:00:04.522093 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" podStartSLOduration=92.522081981 podStartE2EDuration="1m32.522081981s" podCreationTimestamp="2025-12-09 14:58:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:00:04.521701363 +0000 UTC m=+114.658414081" watchObservedRunningTime="2025-12-09 15:00:04.522081981 +0000 UTC m=+114.658794699" Dec 09 15:00:04 crc kubenswrapper[4555]: I1209 15:00:04.754385 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-6sc2q"] Dec 09 15:00:04 crc kubenswrapper[4555]: I1209 15:00:04.754472 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sc2q" Dec 09 15:00:04 crc kubenswrapper[4555]: E1209 15:00:04.754546 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6sc2q" podUID="35f7e7bd-b667-42fa-b040-4a186e4570e3" Dec 09 15:00:05 crc kubenswrapper[4555]: I1209 15:00:05.160460 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 15:00:05 crc kubenswrapper[4555]: E1209 15:00:05.160720 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 15:00:05 crc kubenswrapper[4555]: I1209 15:00:05.160474 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 15:00:05 crc kubenswrapper[4555]: I1209 15:00:05.160471 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 15:00:05 crc kubenswrapper[4555]: E1209 15:00:05.160772 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 15:00:05 crc kubenswrapper[4555]: E1209 15:00:05.160809 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 15:00:05 crc kubenswrapper[4555]: I1209 15:00:05.507512 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gjjcm_3e34542c-29e8-43d3-add3-42aa080e0c8a/kube-multus/1.log" Dec 09 15:00:05 crc kubenswrapper[4555]: I1209 15:00:05.507875 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gjjcm_3e34542c-29e8-43d3-add3-42aa080e0c8a/kube-multus/0.log" Dec 09 15:00:05 crc kubenswrapper[4555]: I1209 15:00:05.507912 4555 generic.go:334] "Generic (PLEG): container finished" podID="3e34542c-29e8-43d3-add3-42aa080e0c8a" containerID="73a16b54312a5c81ee3d5b4753cd53c9a01cc7f174b340b1f0db44b019541fbd" exitCode=1 Dec 09 15:00:05 crc kubenswrapper[4555]: I1209 15:00:05.508017 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gjjcm" event={"ID":"3e34542c-29e8-43d3-add3-42aa080e0c8a","Type":"ContainerDied","Data":"73a16b54312a5c81ee3d5b4753cd53c9a01cc7f174b340b1f0db44b019541fbd"} Dec 09 15:00:05 crc kubenswrapper[4555]: I1209 15:00:05.508190 4555 scope.go:117] "RemoveContainer" containerID="b138ef0764baeeae025ff996cd643edabbc1de8f567e73f41fa748f481929656" Dec 09 15:00:05 crc kubenswrapper[4555]: I1209 15:00:05.508494 4555 scope.go:117] "RemoveContainer" containerID="73a16b54312a5c81ee3d5b4753cd53c9a01cc7f174b340b1f0db44b019541fbd" Dec 09 15:00:05 crc kubenswrapper[4555]: E1209 15:00:05.508641 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-gjjcm_openshift-multus(3e34542c-29e8-43d3-add3-42aa080e0c8a)\"" pod="openshift-multus/multus-gjjcm" podUID="3e34542c-29e8-43d3-add3-42aa080e0c8a" Dec 09 15:00:06 crc kubenswrapper[4555]: I1209 15:00:06.161240 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sc2q" Dec 09 15:00:06 crc kubenswrapper[4555]: E1209 15:00:06.161401 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6sc2q" podUID="35f7e7bd-b667-42fa-b040-4a186e4570e3" Dec 09 15:00:06 crc kubenswrapper[4555]: I1209 15:00:06.510813 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gjjcm_3e34542c-29e8-43d3-add3-42aa080e0c8a/kube-multus/1.log" Dec 09 15:00:07 crc kubenswrapper[4555]: I1209 15:00:07.160364 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 15:00:07 crc kubenswrapper[4555]: I1209 15:00:07.160420 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 15:00:07 crc kubenswrapper[4555]: E1209 15:00:07.160456 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 15:00:07 crc kubenswrapper[4555]: I1209 15:00:07.160510 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 15:00:07 crc kubenswrapper[4555]: E1209 15:00:07.160639 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 15:00:07 crc kubenswrapper[4555]: E1209 15:00:07.160767 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 15:00:08 crc kubenswrapper[4555]: I1209 15:00:08.160258 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sc2q" Dec 09 15:00:08 crc kubenswrapper[4555]: E1209 15:00:08.160373 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6sc2q" podUID="35f7e7bd-b667-42fa-b040-4a186e4570e3" Dec 09 15:00:09 crc kubenswrapper[4555]: I1209 15:00:09.160417 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 15:00:09 crc kubenswrapper[4555]: E1209 15:00:09.160498 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 15:00:09 crc kubenswrapper[4555]: I1209 15:00:09.160423 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 15:00:09 crc kubenswrapper[4555]: E1209 15:00:09.160565 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 15:00:09 crc kubenswrapper[4555]: I1209 15:00:09.160423 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 15:00:09 crc kubenswrapper[4555]: E1209 15:00:09.160619 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 15:00:10 crc kubenswrapper[4555]: I1209 15:00:10.160052 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sc2q" Dec 09 15:00:10 crc kubenswrapper[4555]: E1209 15:00:10.160813 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6sc2q" podUID="35f7e7bd-b667-42fa-b040-4a186e4570e3" Dec 09 15:00:10 crc kubenswrapper[4555]: E1209 15:00:10.165923 4555 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Dec 09 15:00:10 crc kubenswrapper[4555]: E1209 15:00:10.218010 4555 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 09 15:00:11 crc kubenswrapper[4555]: I1209 15:00:11.160331 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 15:00:11 crc kubenswrapper[4555]: E1209 15:00:11.160611 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 15:00:11 crc kubenswrapper[4555]: I1209 15:00:11.160496 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 15:00:11 crc kubenswrapper[4555]: I1209 15:00:11.160376 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 15:00:11 crc kubenswrapper[4555]: E1209 15:00:11.160851 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 15:00:11 crc kubenswrapper[4555]: E1209 15:00:11.161161 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 15:00:12 crc kubenswrapper[4555]: I1209 15:00:12.160385 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sc2q" Dec 09 15:00:12 crc kubenswrapper[4555]: E1209 15:00:12.160673 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6sc2q" podUID="35f7e7bd-b667-42fa-b040-4a186e4570e3" Dec 09 15:00:13 crc kubenswrapper[4555]: I1209 15:00:13.160167 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 15:00:13 crc kubenswrapper[4555]: I1209 15:00:13.160221 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 15:00:13 crc kubenswrapper[4555]: E1209 15:00:13.160266 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 15:00:13 crc kubenswrapper[4555]: I1209 15:00:13.160279 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 15:00:13 crc kubenswrapper[4555]: E1209 15:00:13.160389 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 15:00:13 crc kubenswrapper[4555]: E1209 15:00:13.160527 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 15:00:14 crc kubenswrapper[4555]: I1209 15:00:14.160536 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sc2q" Dec 09 15:00:14 crc kubenswrapper[4555]: E1209 15:00:14.160655 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6sc2q" podUID="35f7e7bd-b667-42fa-b040-4a186e4570e3" Dec 09 15:00:15 crc kubenswrapper[4555]: I1209 15:00:15.160886 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 15:00:15 crc kubenswrapper[4555]: E1209 15:00:15.161141 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 15:00:15 crc kubenswrapper[4555]: I1209 15:00:15.161260 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 15:00:15 crc kubenswrapper[4555]: E1209 15:00:15.161324 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 15:00:15 crc kubenswrapper[4555]: I1209 15:00:15.161444 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 15:00:15 crc kubenswrapper[4555]: E1209 15:00:15.161569 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 15:00:15 crc kubenswrapper[4555]: E1209 15:00:15.219370 4555 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 09 15:00:16 crc kubenswrapper[4555]: I1209 15:00:16.161034 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sc2q" Dec 09 15:00:16 crc kubenswrapper[4555]: E1209 15:00:16.161157 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6sc2q" podUID="35f7e7bd-b667-42fa-b040-4a186e4570e3" Dec 09 15:00:16 crc kubenswrapper[4555]: I1209 15:00:16.161389 4555 scope.go:117] "RemoveContainer" containerID="73a16b54312a5c81ee3d5b4753cd53c9a01cc7f174b340b1f0db44b019541fbd" Dec 09 15:00:16 crc kubenswrapper[4555]: I1209 15:00:16.532420 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gjjcm_3e34542c-29e8-43d3-add3-42aa080e0c8a/kube-multus/1.log" Dec 09 15:00:16 crc kubenswrapper[4555]: I1209 15:00:16.532463 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gjjcm" event={"ID":"3e34542c-29e8-43d3-add3-42aa080e0c8a","Type":"ContainerStarted","Data":"fddfb072bd733f5c700d405bf80dcbbe045961e870311c7880dc5d639fdb0059"} Dec 09 15:00:17 crc kubenswrapper[4555]: I1209 15:00:17.161050 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 15:00:17 crc kubenswrapper[4555]: I1209 15:00:17.161089 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 15:00:17 crc kubenswrapper[4555]: E1209 15:00:17.161150 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 15:00:17 crc kubenswrapper[4555]: I1209 15:00:17.161064 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 15:00:17 crc kubenswrapper[4555]: E1209 15:00:17.161216 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 15:00:17 crc kubenswrapper[4555]: E1209 15:00:17.161294 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 15:00:18 crc kubenswrapper[4555]: I1209 15:00:18.160635 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sc2q" Dec 09 15:00:18 crc kubenswrapper[4555]: E1209 15:00:18.160743 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6sc2q" podUID="35f7e7bd-b667-42fa-b040-4a186e4570e3" Dec 09 15:00:19 crc kubenswrapper[4555]: I1209 15:00:19.160551 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 15:00:19 crc kubenswrapper[4555]: I1209 15:00:19.160583 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 15:00:19 crc kubenswrapper[4555]: I1209 15:00:19.160588 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 15:00:19 crc kubenswrapper[4555]: E1209 15:00:19.160644 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 15:00:19 crc kubenswrapper[4555]: E1209 15:00:19.160769 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 15:00:19 crc kubenswrapper[4555]: E1209 15:00:19.160793 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 15:00:20 crc kubenswrapper[4555]: I1209 15:00:20.160482 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sc2q" Dec 09 15:00:20 crc kubenswrapper[4555]: E1209 15:00:20.161198 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6sc2q" podUID="35f7e7bd-b667-42fa-b040-4a186e4570e3" Dec 09 15:00:21 crc kubenswrapper[4555]: I1209 15:00:21.160928 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 15:00:21 crc kubenswrapper[4555]: I1209 15:00:21.160979 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 15:00:21 crc kubenswrapper[4555]: I1209 15:00:21.161065 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 15:00:21 crc kubenswrapper[4555]: I1209 15:00:21.162650 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 09 15:00:21 crc kubenswrapper[4555]: I1209 15:00:21.162764 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 09 15:00:21 crc kubenswrapper[4555]: I1209 15:00:21.163036 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 09 15:00:21 crc kubenswrapper[4555]: I1209 15:00:21.163128 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.161137 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sc2q" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.162399 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.162437 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.928668 4555 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.951483 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-rf8rn"] Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.951837 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-rf8rn" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.952132 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-5ncnf"] Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.952403 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-5ncnf" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.952657 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-8fb58"] Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.953055 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8fb58" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.953230 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qdpr6"] Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.953488 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qdpr6" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.953814 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mk586"] Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.954141 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mk586" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.954702 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.954704 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-xdwlv"] Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.954710 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.957930 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-5xggk"] Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.961087 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-vbr87"] Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.961144 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5xggk" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.961995 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-vbr87" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.962622 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.962822 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-xdwlv" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.962918 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.963007 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.963164 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.963274 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.963490 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.963507 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.963592 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.963654 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.963707 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.963771 4555 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.963781 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.963858 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.963931 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.964129 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.964141 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.964228 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.964146 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.964280 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.964332 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.964363 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.964404 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.964429 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.965340 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.965469 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.965583 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.965885 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.967042 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.969337 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jxl9b"] Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 
15:00:22.969719 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-llvq5"] Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.969923 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-nqlbc"] Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.970098 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jxl9b" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.970155 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-nqlbc" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.970467 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-llvq5" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.970488 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-wd6tn"] Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.970810 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-wd6tn" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.971137 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-5kplw"] Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.971531 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-5kplw" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.971635 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-mtdft"] Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.972128 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mtdft" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.972831 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.972975 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.973087 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.973357 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.973525 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.973641 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.975685 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.976211 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.980390 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.980632 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.980643 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.981314 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-vhncm"] Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.985054 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-srtkn"] Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.985473 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.985568 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.985765 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.985962 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.986108 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.986150 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.986197 
4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.986235 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.986275 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.986343 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.986416 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.986540 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.986623 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.986672 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.986680 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.986716 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.986777 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.986787 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.986852 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.986868 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.986929 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.986943 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.986988 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.987020 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.987056 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.987098 4555 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.987120 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.987178 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.987211 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.987218 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.987249 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.987281 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.987315 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.987334 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.987183 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.987406 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.987469 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.989056 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.989708 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-vhncm" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.990279 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.990426 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.990582 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-7drj7"] Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.990858 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gkckz"] Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.991059 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7h6sb"] Dec 09 15:00:22 crc kubenswrapper[4555]: I1209 15:00:22.991159 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-srtkn" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.000961 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.001409 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.001951 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.002213 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.002491 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-7drj7" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.003444 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wzp94"] Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.003570 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gkckz" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.019877 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7h6sb" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.020637 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9xbb6"] Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.020950 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-v5j6b"] Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.021244 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-pw6lj"] Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.021576 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-pw6lj" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.021738 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-v5j6b" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.021792 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-2f6gc"] Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.022096 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wzp94" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.022336 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2f6gc" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.022387 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9xbb6" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.022622 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-b27rg"] Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.022675 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.023344 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-b27rg" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.024151 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.024496 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.024527 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.024573 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.024616 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.024641 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.024690 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.024701 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.024694 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.024795 4555 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-dns-operator"/"metrics-tls" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.024867 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.024880 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.025374 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.028619 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7trl6"] Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.029113 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-89bvk"] Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.029738 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-hgkns"] Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.029901 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7trl6" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.030026 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-89bvk" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.030542 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hgkns" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.032427 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.033255 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.033544 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.034731 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gt6wf"] Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.035101 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gt6wf" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.037353 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-wt67p"] Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.037739 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-pldb7"] Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.037983 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-srlsw"] Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.038269 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-srlsw" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.038718 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-wt67p" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.038749 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-pldb7" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.039253 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-4tvxr"] Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.044724 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-skgqs"] Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.045029 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-rf8rn"] Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.045052 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-r9tjc"] Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.045274 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-skgqs" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.045286 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-4tvxr" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.045740 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421540-hhxsf"] Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.045808 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-r9tjc" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.046318 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-msfrj"] Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.046684 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rsbss"] Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.046727 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421540-hhxsf" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.053354 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-mtdft"] Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.053405 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-vbr87"] Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.053422 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-89w6m"] Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.053885 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-msfrj" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.054261 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-89w6m" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.054279 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rsbss" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.055837 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qdpr6"] Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.056048 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.056252 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wzp94"] Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.063103 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-srtkn"] Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.063135 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-xdwlv"] Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.065217 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-wd6tn"] Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.065256 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-89bvk"] Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.065267 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-5ncnf"] Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.067841 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mk586"] Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.067873 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-llvq5"] Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.067884 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jxl9b"] Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.069167 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7h6sb"] Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.071363 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-wt67p"] Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.071533 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-pw6lj"] Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.073607 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-vhncm"] Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.073645 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-b27rg"] Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.074132 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-hgkns"] Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.076032 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.077561 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-v5j6b"] Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.077981 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-srlsw"] Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.078905 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-2f6gc"] Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.079680 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-nqlbc"] Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.080561 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7trl6"] Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.081235 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421540-hhxsf"] Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.081946 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/61f20e90-b252-48ed-978a-8aec4b3d98ba-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-7h6sb\" (UID: \"61f20e90-b252-48ed-978a-8aec4b3d98ba\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7h6sb" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.082030 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gt6wf"] Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.082078 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/86d9973d-84c0-4a1b-9e86-7609a79e4bfa-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-llvq5\" (UID: \"86d9973d-84c0-4a1b-9e86-7609a79e4bfa\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-llvq5" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.082231 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cz49\" (UniqueName: \"kubernetes.io/projected/c16a9d27-0e19-4e91-ad2b-0a5e3948e19c-kube-api-access-7cz49\") pod \"machine-api-operator-5694c8668f-5ncnf\" (UID: \"c16a9d27-0e19-4e91-ad2b-0a5e3948e19c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5ncnf" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.082260 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dae2070b-58ae-47b7-8609-3ca457ed809a-serving-cert\") pod \"controller-manager-879f6c89f-rf8rn\" (UID: \"dae2070b-58ae-47b7-8609-3ca457ed809a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-rf8rn" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.082275 4555 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0f5603f-8e68-4cfb-9aa1-a6854b684172-serving-cert\") pod \"route-controller-manager-6576b87f9c-qdpr6\" (UID: \"d0f5603f-8e68-4cfb-9aa1-a6854b684172\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qdpr6" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.082290 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86d9973d-84c0-4a1b-9e86-7609a79e4bfa-config\") pod \"kube-apiserver-operator-766d6c64bb-llvq5\" (UID: \"86d9973d-84c0-4a1b-9e86-7609a79e4bfa\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-llvq5" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.082330 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dae2070b-58ae-47b7-8609-3ca457ed809a-config\") pod \"controller-manager-879f6c89f-rf8rn\" (UID: \"dae2070b-58ae-47b7-8609-3ca457ed809a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-rf8rn" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.082343 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c16a9d27-0e19-4e91-ad2b-0a5e3948e19c-config\") pod \"machine-api-operator-5694c8668f-5ncnf\" (UID: \"c16a9d27-0e19-4e91-ad2b-0a5e3948e19c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5ncnf" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.082409 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c16a9d27-0e19-4e91-ad2b-0a5e3948e19c-images\") pod \"machine-api-operator-5694c8668f-5ncnf\" (UID: \"c16a9d27-0e19-4e91-ad2b-0a5e3948e19c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5ncnf" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.082442 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6lr5\" (UniqueName: \"kubernetes.io/projected/d0f5603f-8e68-4cfb-9aa1-a6854b684172-kube-api-access-v6lr5\") pod \"route-controller-manager-6576b87f9c-qdpr6\" (UID: \"d0f5603f-8e68-4cfb-9aa1-a6854b684172\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qdpr6" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.082486 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/c16a9d27-0e19-4e91-ad2b-0a5e3948e19c-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-5ncnf\" (UID: \"c16a9d27-0e19-4e91-ad2b-0a5e3948e19c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5ncnf" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.082509 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61f20e90-b252-48ed-978a-8aec4b3d98ba-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-7h6sb\" (UID: \"61f20e90-b252-48ed-978a-8aec4b3d98ba\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7h6sb" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 
15:00:23.082544 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlmll\" (UniqueName: \"kubernetes.io/projected/61f20e90-b252-48ed-978a-8aec4b3d98ba-kube-api-access-vlmll\") pod \"kube-storage-version-migrator-operator-b67b599dd-7h6sb\" (UID: \"61f20e90-b252-48ed-978a-8aec4b3d98ba\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7h6sb" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.082576 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0f5603f-8e68-4cfb-9aa1-a6854b684172-config\") pod \"route-controller-manager-6576b87f9c-qdpr6\" (UID: \"d0f5603f-8e68-4cfb-9aa1-a6854b684172\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qdpr6" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.082621 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d0f5603f-8e68-4cfb-9aa1-a6854b684172-client-ca\") pod \"route-controller-manager-6576b87f9c-qdpr6\" (UID: \"d0f5603f-8e68-4cfb-9aa1-a6854b684172\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qdpr6" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.082636 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-257hz\" (UniqueName: \"kubernetes.io/projected/dae2070b-58ae-47b7-8609-3ca457ed809a-kube-api-access-257hz\") pod \"controller-manager-879f6c89f-rf8rn\" (UID: \"dae2070b-58ae-47b7-8609-3ca457ed809a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-rf8rn" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.082674 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnd95\" (UniqueName: \"kubernetes.io/projected/f86821ee-dfd8-4090-bd6c-8f624a1362ae-kube-api-access-tnd95\") pod \"downloads-7954f5f757-vbr87\" (UID: \"f86821ee-dfd8-4090-bd6c-8f624a1362ae\") " pod="openshift-console/downloads-7954f5f757-vbr87" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.082689 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dae2070b-58ae-47b7-8609-3ca457ed809a-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-rf8rn\" (UID: \"dae2070b-58ae-47b7-8609-3ca457ed809a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-rf8rn" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.082702 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dae2070b-58ae-47b7-8609-3ca457ed809a-client-ca\") pod \"controller-manager-879f6c89f-rf8rn\" (UID: \"dae2070b-58ae-47b7-8609-3ca457ed809a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-rf8rn" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.082729 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86d9973d-84c0-4a1b-9e86-7609a79e4bfa-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-llvq5\" (UID: \"86d9973d-84c0-4a1b-9e86-7609a79e4bfa\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-llvq5" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.082781 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gkckz"] Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.083646 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9xbb6"] Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.084268 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-msfrj"] Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.085095 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-7drj7"] Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.086015 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-skgqs"] Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.086887 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-r9tjc"] Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.087734 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-5kplw"] Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.088615 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-5xggk"] Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.089409 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rsbss"] Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.090172 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-89w6m"] Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.091542 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-whvjt"] Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.092090 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-2wsmx"] Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.092179 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-whvjt" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.092922 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.093007 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-2wsmx" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.093228 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-4tvxr"] Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.094073 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-2wsmx"] Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.112839 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.133027 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.153284 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.161971 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-9l7sb"] Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.162514 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-9l7sb" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.169343 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-9l7sb"] Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.173655 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.183395 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnd95\" (UniqueName: \"kubernetes.io/projected/f86821ee-dfd8-4090-bd6c-8f624a1362ae-kube-api-access-tnd95\") pod \"downloads-7954f5f757-vbr87\" (UID: \"f86821ee-dfd8-4090-bd6c-8f624a1362ae\") " pod="openshift-console/downloads-7954f5f757-vbr87" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.183424 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dae2070b-58ae-47b7-8609-3ca457ed809a-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-rf8rn\" (UID: \"dae2070b-58ae-47b7-8609-3ca457ed809a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-rf8rn" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.183444 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dae2070b-58ae-47b7-8609-3ca457ed809a-client-ca\") pod \"controller-manager-879f6c89f-rf8rn\" (UID: \"dae2070b-58ae-47b7-8609-3ca457ed809a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-rf8rn" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.183469 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86d9973d-84c0-4a1b-9e86-7609a79e4bfa-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-llvq5\" (UID: \"86d9973d-84c0-4a1b-9e86-7609a79e4bfa\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-llvq5" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.183490 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/61f20e90-b252-48ed-978a-8aec4b3d98ba-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-7h6sb\" (UID: \"61f20e90-b252-48ed-978a-8aec4b3d98ba\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7h6sb" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.183514 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/86d9973d-84c0-4a1b-9e86-7609a79e4bfa-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-llvq5\" (UID: \"86d9973d-84c0-4a1b-9e86-7609a79e4bfa\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-llvq5" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.183529 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dae2070b-58ae-47b7-8609-3ca457ed809a-serving-cert\") pod \"controller-manager-879f6c89f-rf8rn\" (UID: \"dae2070b-58ae-47b7-8609-3ca457ed809a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-rf8rn" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.183558 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cz49\" (UniqueName: \"kubernetes.io/projected/c16a9d27-0e19-4e91-ad2b-0a5e3948e19c-kube-api-access-7cz49\") pod \"machine-api-operator-5694c8668f-5ncnf\" (UID: \"c16a9d27-0e19-4e91-ad2b-0a5e3948e19c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5ncnf" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.183589 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0f5603f-8e68-4cfb-9aa1-a6854b684172-serving-cert\") pod \"route-controller-manager-6576b87f9c-qdpr6\" (UID: \"d0f5603f-8e68-4cfb-9aa1-a6854b684172\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qdpr6" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.183634 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86d9973d-84c0-4a1b-9e86-7609a79e4bfa-config\") pod \"kube-apiserver-operator-766d6c64bb-llvq5\" (UID: \"86d9973d-84c0-4a1b-9e86-7609a79e4bfa\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-llvq5" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.183660 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dae2070b-58ae-47b7-8609-3ca457ed809a-config\") pod \"controller-manager-879f6c89f-rf8rn\" (UID: \"dae2070b-58ae-47b7-8609-3ca457ed809a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-rf8rn" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.183684 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c16a9d27-0e19-4e91-ad2b-0a5e3948e19c-config\") pod \"machine-api-operator-5694c8668f-5ncnf\" (UID: \"c16a9d27-0e19-4e91-ad2b-0a5e3948e19c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5ncnf" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.183713 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c16a9d27-0e19-4e91-ad2b-0a5e3948e19c-images\") pod \"machine-api-operator-5694c8668f-5ncnf\" (UID: 
\"c16a9d27-0e19-4e91-ad2b-0a5e3948e19c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5ncnf" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.183744 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6lr5\" (UniqueName: \"kubernetes.io/projected/d0f5603f-8e68-4cfb-9aa1-a6854b684172-kube-api-access-v6lr5\") pod \"route-controller-manager-6576b87f9c-qdpr6\" (UID: \"d0f5603f-8e68-4cfb-9aa1-a6854b684172\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qdpr6" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.183778 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/c16a9d27-0e19-4e91-ad2b-0a5e3948e19c-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-5ncnf\" (UID: \"c16a9d27-0e19-4e91-ad2b-0a5e3948e19c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5ncnf" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.183810 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61f20e90-b252-48ed-978a-8aec4b3d98ba-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-7h6sb\" (UID: \"61f20e90-b252-48ed-978a-8aec4b3d98ba\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7h6sb" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.183840 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlmll\" (UniqueName: \"kubernetes.io/projected/61f20e90-b252-48ed-978a-8aec4b3d98ba-kube-api-access-vlmll\") pod \"kube-storage-version-migrator-operator-b67b599dd-7h6sb\" (UID: \"61f20e90-b252-48ed-978a-8aec4b3d98ba\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7h6sb" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.183872 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0f5603f-8e68-4cfb-9aa1-a6854b684172-config\") pod \"route-controller-manager-6576b87f9c-qdpr6\" (UID: \"d0f5603f-8e68-4cfb-9aa1-a6854b684172\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qdpr6" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.183909 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d0f5603f-8e68-4cfb-9aa1-a6854b684172-client-ca\") pod \"route-controller-manager-6576b87f9c-qdpr6\" (UID: \"d0f5603f-8e68-4cfb-9aa1-a6854b684172\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qdpr6" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.183935 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-257hz\" (UniqueName: \"kubernetes.io/projected/dae2070b-58ae-47b7-8609-3ca457ed809a-kube-api-access-257hz\") pod \"controller-manager-879f6c89f-rf8rn\" (UID: \"dae2070b-58ae-47b7-8609-3ca457ed809a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-rf8rn" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.185010 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61f20e90-b252-48ed-978a-8aec4b3d98ba-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-7h6sb\" (UID: 
\"61f20e90-b252-48ed-978a-8aec4b3d98ba\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7h6sb" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.185561 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dae2070b-58ae-47b7-8609-3ca457ed809a-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-rf8rn\" (UID: \"dae2070b-58ae-47b7-8609-3ca457ed809a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-rf8rn" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.185756 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c16a9d27-0e19-4e91-ad2b-0a5e3948e19c-config\") pod \"machine-api-operator-5694c8668f-5ncnf\" (UID: \"c16a9d27-0e19-4e91-ad2b-0a5e3948e19c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5ncnf" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.186211 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d0f5603f-8e68-4cfb-9aa1-a6854b684172-client-ca\") pod \"route-controller-manager-6576b87f9c-qdpr6\" (UID: \"d0f5603f-8e68-4cfb-9aa1-a6854b684172\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qdpr6" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.186235 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c16a9d27-0e19-4e91-ad2b-0a5e3948e19c-images\") pod \"machine-api-operator-5694c8668f-5ncnf\" (UID: \"c16a9d27-0e19-4e91-ad2b-0a5e3948e19c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5ncnf" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.186415 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0f5603f-8e68-4cfb-9aa1-a6854b684172-config\") pod \"route-controller-manager-6576b87f9c-qdpr6\" (UID: \"d0f5603f-8e68-4cfb-9aa1-a6854b684172\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qdpr6" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.187634 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86d9973d-84c0-4a1b-9e86-7609a79e4bfa-config\") pod \"kube-apiserver-operator-766d6c64bb-llvq5\" (UID: \"86d9973d-84c0-4a1b-9e86-7609a79e4bfa\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-llvq5" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.188476 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dae2070b-58ae-47b7-8609-3ca457ed809a-client-ca\") pod \"controller-manager-879f6c89f-rf8rn\" (UID: \"dae2070b-58ae-47b7-8609-3ca457ed809a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-rf8rn" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.188918 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dae2070b-58ae-47b7-8609-3ca457ed809a-config\") pod \"controller-manager-879f6c89f-rf8rn\" (UID: \"dae2070b-58ae-47b7-8609-3ca457ed809a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-rf8rn" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.189534 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/86d9973d-84c0-4a1b-9e86-7609a79e4bfa-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-llvq5\" (UID: \"86d9973d-84c0-4a1b-9e86-7609a79e4bfa\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-llvq5" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.189859 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/c16a9d27-0e19-4e91-ad2b-0a5e3948e19c-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-5ncnf\" (UID: \"c16a9d27-0e19-4e91-ad2b-0a5e3948e19c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5ncnf" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.190980 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dae2070b-58ae-47b7-8609-3ca457ed809a-serving-cert\") pod \"controller-manager-879f6c89f-rf8rn\" (UID: \"dae2070b-58ae-47b7-8609-3ca457ed809a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-rf8rn" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.191134 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0f5603f-8e68-4cfb-9aa1-a6854b684172-serving-cert\") pod \"route-controller-manager-6576b87f9c-qdpr6\" (UID: \"d0f5603f-8e68-4cfb-9aa1-a6854b684172\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qdpr6" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.192835 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/61f20e90-b252-48ed-978a-8aec4b3d98ba-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-7h6sb\" (UID: \"61f20e90-b252-48ed-978a-8aec4b3d98ba\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7h6sb" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.193571 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.213094 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.233434 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.253153 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.273314 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.293136 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.322979 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.333853 4555 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.356613 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.372985 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.393314 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.413669 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.433946 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.457963 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.473882 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.493060 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.513644 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.533829 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.553896 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.573314 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.593736 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.613477 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.633787 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.658742 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.673787 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.693086 4555 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.713316 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.733746 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.752794 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.773151 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.793517 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.813876 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.833172 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.853444 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.873123 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.893027 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.913141 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.973428 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 09 15:00:23 crc kubenswrapper[4555]: I1209 15:00:23.993663 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 09 15:00:24 crc kubenswrapper[4555]: I1209 15:00:24.014122 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 09 15:00:24 crc kubenswrapper[4555]: I1209 15:00:24.032331 4555 request.go:700] Waited for 1.002113977s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-storage-version-migrator/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&limit=500&resourceVersion=0 Dec 09 15:00:24 crc kubenswrapper[4555]: I1209 15:00:24.033279 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 09 15:00:24 crc kubenswrapper[4555]: I1209 15:00:24.053373 4555 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 09 15:00:24 crc kubenswrapper[4555]: I1209 15:00:24.074125 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 09 15:00:24 crc kubenswrapper[4555]: I1209 15:00:24.093201 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 09 15:00:24 crc kubenswrapper[4555]: I1209 15:00:24.113239 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 09 15:00:24 crc kubenswrapper[4555]: I1209 15:00:24.133465 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 09 15:00:24 crc kubenswrapper[4555]: I1209 15:00:24.153143 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 09 15:00:24 crc kubenswrapper[4555]: I1209 15:00:24.172893 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 09 15:00:24 crc kubenswrapper[4555]: I1209 15:00:24.193548 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 09 15:00:24 crc kubenswrapper[4555]: I1209 15:00:24.213529 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 09 15:00:24 crc kubenswrapper[4555]: I1209 15:00:24.233319 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 09 15:00:24 crc kubenswrapper[4555]: I1209 15:00:24.253991 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 09 15:00:24 crc kubenswrapper[4555]: I1209 15:00:24.277382 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 09 15:00:24 crc kubenswrapper[4555]: I1209 15:00:24.293662 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 09 15:00:24 crc kubenswrapper[4555]: I1209 15:00:24.313619 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 09 15:00:24 crc kubenswrapper[4555]: I1209 15:00:24.332905 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 09 15:00:24 crc kubenswrapper[4555]: I1209 15:00:24.353050 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 09 15:00:24 crc kubenswrapper[4555]: I1209 15:00:24.373018 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 09 15:00:24 crc kubenswrapper[4555]: I1209 15:00:24.393082 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 09 15:00:24 crc kubenswrapper[4555]: I1209 15:00:24.413167 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 09 15:00:24 crc kubenswrapper[4555]: I1209 15:00:24.433719 4555 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 09 15:00:24 crc kubenswrapper[4555]: I1209 15:00:24.453960 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 09 15:00:24 crc kubenswrapper[4555]: I1209 15:00:24.472867 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 09 15:00:24 crc kubenswrapper[4555]: I1209 15:00:24.493201 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 09 15:00:24 crc kubenswrapper[4555]: I1209 15:00:24.513618 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 09 15:00:24 crc kubenswrapper[4555]: I1209 15:00:24.534287 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 09 15:00:24 crc kubenswrapper[4555]: I1209 15:00:24.553351 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 09 15:00:24 crc kubenswrapper[4555]: I1209 15:00:24.573513 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 09 15:00:24 crc kubenswrapper[4555]: I1209 15:00:24.594197 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 09 15:00:24 crc kubenswrapper[4555]: I1209 15:00:24.613505 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 09 15:00:24 crc kubenswrapper[4555]: I1209 15:00:24.634080 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 09 15:00:24 crc kubenswrapper[4555]: I1209 15:00:24.653493 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 09 15:00:24 crc kubenswrapper[4555]: I1209 15:00:24.673670 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 09 15:00:24 crc kubenswrapper[4555]: I1209 15:00:24.693970 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 09 15:00:24 crc kubenswrapper[4555]: I1209 15:00:24.713253 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 09 15:00:24 crc kubenswrapper[4555]: I1209 15:00:24.733573 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 09 15:00:24 crc kubenswrapper[4555]: I1209 15:00:24.753825 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 09 15:00:24 crc kubenswrapper[4555]: I1209 15:00:24.773244 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 09 15:00:24 crc kubenswrapper[4555]: I1209 15:00:24.793667 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 09 15:00:24 crc kubenswrapper[4555]: I1209 15:00:24.813267 4555 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 09 15:00:24 crc kubenswrapper[4555]: I1209 15:00:24.833085 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 09 15:00:24 crc kubenswrapper[4555]: I1209 15:00:24.853246 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 09 15:00:24 crc kubenswrapper[4555]: I1209 15:00:24.873441 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 09 15:00:24 crc kubenswrapper[4555]: I1209 15:00:24.893710 4555 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 09 15:00:24 crc kubenswrapper[4555]: I1209 15:00:24.913920 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 09 15:00:24 crc kubenswrapper[4555]: I1209 15:00:24.933469 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 09 15:00:24 crc kubenswrapper[4555]: I1209 15:00:24.953992 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 09 15:00:24 crc kubenswrapper[4555]: I1209 15:00:24.973046 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 09 15:00:24 crc kubenswrapper[4555]: I1209 15:00:24.993398 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.024949 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnd95\" (UniqueName: \"kubernetes.io/projected/f86821ee-dfd8-4090-bd6c-8f624a1362ae-kube-api-access-tnd95\") pod \"downloads-7954f5f757-vbr87\" (UID: \"f86821ee-dfd8-4090-bd6c-8f624a1362ae\") " pod="openshift-console/downloads-7954f5f757-vbr87" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.032660 4555 request.go:700] Waited for 1.848560221s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/serviceaccounts/openshift-controller-manager-sa/token Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.045024 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-257hz\" (UniqueName: \"kubernetes.io/projected/dae2070b-58ae-47b7-8609-3ca457ed809a-kube-api-access-257hz\") pod \"controller-manager-879f6c89f-rf8rn\" (UID: \"dae2070b-58ae-47b7-8609-3ca457ed809a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-rf8rn" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.063821 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlmll\" (UniqueName: \"kubernetes.io/projected/61f20e90-b252-48ed-978a-8aec4b3d98ba-kube-api-access-vlmll\") pod \"kube-storage-version-migrator-operator-b67b599dd-7h6sb\" (UID: \"61f20e90-b252-48ed-978a-8aec4b3d98ba\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7h6sb" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.070840 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-rf8rn" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.084020 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6lr5\" (UniqueName: \"kubernetes.io/projected/d0f5603f-8e68-4cfb-9aa1-a6854b684172-kube-api-access-v6lr5\") pod \"route-controller-manager-6576b87f9c-qdpr6\" (UID: \"d0f5603f-8e68-4cfb-9aa1-a6854b684172\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qdpr6" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.103044 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/86d9973d-84c0-4a1b-9e86-7609a79e4bfa-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-llvq5\" (UID: \"86d9973d-84c0-4a1b-9e86-7609a79e4bfa\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-llvq5" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.121334 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qdpr6" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.124096 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cz49\" (UniqueName: \"kubernetes.io/projected/c16a9d27-0e19-4e91-ad2b-0a5e3948e19c-kube-api-access-7cz49\") pod \"machine-api-operator-5694c8668f-5ncnf\" (UID: \"c16a9d27-0e19-4e91-ad2b-0a5e3948e19c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5ncnf" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.155705 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-vbr87" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.188062 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-llvq5" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.192141 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-rf8rn"] Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.197809 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eee41022-2a9c-41c8-bbc8-ff42874c4050-trusted-ca\") pod \"image-registry-697d97f7c8-wd6tn\" (UID: \"eee41022-2a9c-41c8-bbc8-ff42874c4050\") " pod="openshift-image-registry/image-registry-697d97f7c8-wd6tn" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.197832 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbl9s\" (UniqueName: \"kubernetes.io/projected/b5a1d5a2-358f-4445-9253-5a9bc8b9daf3-kube-api-access-xbl9s\") pod \"openshift-controller-manager-operator-756b6f6bc6-wzp94\" (UID: \"b5a1d5a2-358f-4445-9253-5a9bc8b9daf3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wzp94" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.197851 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8jwt\" (UniqueName: \"kubernetes.io/projected/77dac49a-34a8-421c-99ab-1fa4a08fc6fa-kube-api-access-p8jwt\") pod \"openshift-config-operator-7777fb866f-mtdft\" (UID: \"77dac49a-34a8-421c-99ab-1fa4a08fc6fa\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mtdft" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.197875 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d15dddf-4a72-4590-9def-755a9fd11da6-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-5kplw\" (UID: \"2d15dddf-4a72-4590-9def-755a9fd11da6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5kplw" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.197890 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d15dddf-4a72-4590-9def-755a9fd11da6-config\") pod \"authentication-operator-69f744f599-5kplw\" (UID: \"2d15dddf-4a72-4590-9def-755a9fd11da6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5kplw" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.197904 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a7fc2be6-0cb9-4fad-ac58-37004264a563-console-serving-cert\") pod \"console-f9d7485db-vhncm\" (UID: \"a7fc2be6-0cb9-4fad-ac58-37004264a563\") " pod="openshift-console/console-f9d7485db-vhncm" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.197924 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/897e7b14-cb3a-40e7-a127-587bc5e9502a-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-gkckz\" (UID: \"897e7b14-cb3a-40e7-a127-587bc5e9502a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gkckz" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.197939 4555 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/89fdb2fe-9563-4934-82c8-10c218d81bea-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-v5j6b\" (UID: \"89fdb2fe-9563-4934-82c8-10c218d81bea\") " pod="openshift-authentication/oauth-openshift-558db77b4-v5j6b" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.197954 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dlj7\" (UniqueName: \"kubernetes.io/projected/e40fc0c2-28f1-444d-8ff4-8f11faa1db2b-kube-api-access-9dlj7\") pod \"etcd-operator-b45778765-nqlbc\" (UID: \"e40fc0c2-28f1-444d-8ff4-8f11faa1db2b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nqlbc" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.197967 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/eee41022-2a9c-41c8-bbc8-ff42874c4050-ca-trust-extracted\") pod \"image-registry-697d97f7c8-wd6tn\" (UID: \"eee41022-2a9c-41c8-bbc8-ff42874c4050\") " pod="openshift-image-registry/image-registry-697d97f7c8-wd6tn" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.197979 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e376f745-76d4-42dd-97f7-4554c02eec74-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-pw6lj\" (UID: \"e376f745-76d4-42dd-97f7-4554c02eec74\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-pw6lj" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.197991 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e40fc0c2-28f1-444d-8ff4-8f11faa1db2b-etcd-client\") pod \"etcd-operator-b45778765-nqlbc\" (UID: \"e40fc0c2-28f1-444d-8ff4-8f11faa1db2b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nqlbc" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.198005 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9krkk\" (UniqueName: \"kubernetes.io/projected/6f6a1efb-6c81-45a4-91e2-9f67f8a266be-kube-api-access-9krkk\") pod \"service-ca-operator-777779d784-b27rg\" (UID: \"6f6a1efb-6c81-45a4-91e2-9f67f8a266be\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-b27rg" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.198044 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnh65\" (UniqueName: \"kubernetes.io/projected/0a4f7efd-8434-48a7-979e-9f128e93797b-kube-api-access-nnh65\") pod \"cluster-samples-operator-665b6dd947-jxl9b\" (UID: \"0a4f7efd-8434-48a7-979e-9f128e93797b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jxl9b" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.198058 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89fdb2fe-9563-4934-82c8-10c218d81bea-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-v5j6b\" (UID: \"89fdb2fe-9563-4934-82c8-10c218d81bea\") " pod="openshift-authentication/oauth-openshift-558db77b4-v5j6b" Dec 09 15:00:25 crc 
kubenswrapper[4555]: I1209 15:00:25.198105 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f6a1efb-6c81-45a4-91e2-9f67f8a266be-serving-cert\") pod \"service-ca-operator-777779d784-b27rg\" (UID: \"6f6a1efb-6c81-45a4-91e2-9f67f8a266be\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-b27rg" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.198171 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/89fdb2fe-9563-4934-82c8-10c218d81bea-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-v5j6b\" (UID: \"89fdb2fe-9563-4934-82c8-10c218d81bea\") " pod="openshift-authentication/oauth-openshift-558db77b4-v5j6b" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.198189 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7fc2be6-0cb9-4fad-ac58-37004264a563-trusted-ca-bundle\") pod \"console-f9d7485db-vhncm\" (UID: \"a7fc2be6-0cb9-4fad-ac58-37004264a563\") " pod="openshift-console/console-f9d7485db-vhncm" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.198210 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/e40fc0c2-28f1-444d-8ff4-8f11faa1db2b-etcd-ca\") pod \"etcd-operator-b45778765-nqlbc\" (UID: \"e40fc0c2-28f1-444d-8ff4-8f11faa1db2b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nqlbc" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.198225 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a7fc2be6-0cb9-4fad-ac58-37004264a563-console-config\") pod \"console-f9d7485db-vhncm\" (UID: \"a7fc2be6-0cb9-4fad-ac58-37004264a563\") " pod="openshift-console/console-f9d7485db-vhncm" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.198376 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b7402585-182c-46a1-a8ac-257c97421c8d-serving-cert\") pod \"apiserver-76f77b778f-xdwlv\" (UID: \"b7402585-182c-46a1-a8ac-257c97421c8d\") " pod="openshift-apiserver/apiserver-76f77b778f-xdwlv" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.198401 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a7fc2be6-0cb9-4fad-ac58-37004264a563-console-oauth-config\") pod \"console-f9d7485db-vhncm\" (UID: \"a7fc2be6-0cb9-4fad-ac58-37004264a563\") " pod="openshift-console/console-f9d7485db-vhncm" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.198417 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/eee41022-2a9c-41c8-bbc8-ff42874c4050-registry-tls\") pod \"image-registry-697d97f7c8-wd6tn\" (UID: \"eee41022-2a9c-41c8-bbc8-ff42874c4050\") " pod="openshift-image-registry/image-registry-697d97f7c8-wd6tn" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.198474 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/b5a1d5a2-358f-4445-9253-5a9bc8b9daf3-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-wzp94\" (UID: \"b5a1d5a2-358f-4445-9253-5a9bc8b9daf3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wzp94" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.198501 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/13cf2230-0cbf-49b1-92f4-b49b8d72acec-audit-policies\") pod \"apiserver-7bbb656c7d-5xggk\" (UID: \"13cf2230-0cbf-49b1-92f4-b49b8d72acec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5xggk" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.198515 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/13cf2230-0cbf-49b1-92f4-b49b8d72acec-audit-dir\") pod \"apiserver-7bbb656c7d-5xggk\" (UID: \"13cf2230-0cbf-49b1-92f4-b49b8d72acec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5xggk" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.198530 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b7402585-182c-46a1-a8ac-257c97421c8d-encryption-config\") pod \"apiserver-76f77b778f-xdwlv\" (UID: \"b7402585-182c-46a1-a8ac-257c97421c8d\") " pod="openshift-apiserver/apiserver-76f77b778f-xdwlv" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.198543 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/89fdb2fe-9563-4934-82c8-10c218d81bea-audit-policies\") pod \"oauth-openshift-558db77b4-v5j6b\" (UID: \"89fdb2fe-9563-4934-82c8-10c218d81bea\") " pod="openshift-authentication/oauth-openshift-558db77b4-v5j6b" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.198559 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/0a4f7efd-8434-48a7-979e-9f128e93797b-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-jxl9b\" (UID: \"0a4f7efd-8434-48a7-979e-9f128e93797b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jxl9b" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.198571 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b7402585-182c-46a1-a8ac-257c97421c8d-trusted-ca-bundle\") pod \"apiserver-76f77b778f-xdwlv\" (UID: \"b7402585-182c-46a1-a8ac-257c97421c8d\") " pod="openshift-apiserver/apiserver-76f77b778f-xdwlv" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.198594 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/adaadc75-0b7d-4699-8d61-fe5ebb7dc84a-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-9xbb6\" (UID: \"adaadc75-0b7d-4699-8d61-fe5ebb7dc84a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9xbb6" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.198609 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/13cf2230-0cbf-49b1-92f4-b49b8d72acec-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-5xggk\" (UID: \"13cf2230-0cbf-49b1-92f4-b49b8d72acec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5xggk" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.198624 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/eee41022-2a9c-41c8-bbc8-ff42874c4050-registry-certificates\") pod \"image-registry-697d97f7c8-wd6tn\" (UID: \"eee41022-2a9c-41c8-bbc8-ff42874c4050\") " pod="openshift-image-registry/image-registry-697d97f7c8-wd6tn" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.198664 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/897e7b14-cb3a-40e7-a127-587bc5e9502a-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-gkckz\" (UID: \"897e7b14-cb3a-40e7-a127-587bc5e9502a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gkckz" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.198691 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d15dddf-4a72-4590-9def-755a9fd11da6-serving-cert\") pod \"authentication-operator-69f744f599-5kplw\" (UID: \"2d15dddf-4a72-4590-9def-755a9fd11da6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5kplw" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.198709 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ce81b97-c663-47e5-b75e-ba3333d5cb9a-config\") pod \"machine-approver-56656f9798-8fb58\" (UID: \"7ce81b97-c663-47e5-b75e-ba3333d5cb9a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8fb58" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.198787 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jjbm\" (UniqueName: \"kubernetes.io/projected/13cf2230-0cbf-49b1-92f4-b49b8d72acec-kube-api-access-6jjbm\") pod \"apiserver-7bbb656c7d-5xggk\" (UID: \"13cf2230-0cbf-49b1-92f4-b49b8d72acec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5xggk" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.198874 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6698v\" (UniqueName: \"kubernetes.io/projected/2d15dddf-4a72-4590-9def-755a9fd11da6-kube-api-access-6698v\") pod \"authentication-operator-69f744f599-5kplw\" (UID: \"2d15dddf-4a72-4590-9def-755a9fd11da6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5kplw" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.199222 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b87660e6-af7f-42a9-bdb5-7bee97dab52f-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-mk586\" (UID: \"b87660e6-af7f-42a9-bdb5-7bee97dab52f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mk586" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.199253 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/adaadc75-0b7d-4699-8d61-fe5ebb7dc84a-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-9xbb6\" (UID: \"adaadc75-0b7d-4699-8d61-fe5ebb7dc84a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9xbb6" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.199328 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqd8z\" (UniqueName: \"kubernetes.io/projected/b87660e6-af7f-42a9-bdb5-7bee97dab52f-kube-api-access-fqd8z\") pod \"openshift-apiserver-operator-796bbdcf4f-mk586\" (UID: \"b87660e6-af7f-42a9-bdb5-7bee97dab52f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mk586" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.199367 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/b7402585-182c-46a1-a8ac-257c97421c8d-image-import-ca\") pod \"apiserver-76f77b778f-xdwlv\" (UID: \"b7402585-182c-46a1-a8ac-257c97421c8d\") " pod="openshift-apiserver/apiserver-76f77b778f-xdwlv" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.199393 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ns6z2\" (UniqueName: \"kubernetes.io/projected/a7fc2be6-0cb9-4fad-ac58-37004264a563-kube-api-access-ns6z2\") pod \"console-f9d7485db-vhncm\" (UID: \"a7fc2be6-0cb9-4fad-ac58-37004264a563\") " pod="openshift-console/console-f9d7485db-vhncm" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.199412 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/7ce81b97-c663-47e5-b75e-ba3333d5cb9a-machine-approver-tls\") pod \"machine-approver-56656f9798-8fb58\" (UID: \"7ce81b97-c663-47e5-b75e-ba3333d5cb9a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8fb58" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.199481 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b7402585-182c-46a1-a8ac-257c97421c8d-node-pullsecrets\") pod \"apiserver-76f77b778f-xdwlv\" (UID: \"b7402585-182c-46a1-a8ac-257c97421c8d\") " pod="openshift-apiserver/apiserver-76f77b778f-xdwlv" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.199503 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b7402585-182c-46a1-a8ac-257c97421c8d-etcd-serving-ca\") pod \"apiserver-76f77b778f-xdwlv\" (UID: \"b7402585-182c-46a1-a8ac-257c97421c8d\") " pod="openshift-apiserver/apiserver-76f77b778f-xdwlv" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.199523 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5446f927-f240-4554-832a-28b950424eea-metrics-tls\") pod \"dns-operator-744455d44c-srtkn\" (UID: \"5446f927-f240-4554-832a-28b950424eea\") " pod="openshift-dns-operator/dns-operator-744455d44c-srtkn" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.199539 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/89fdb2fe-9563-4934-82c8-10c218d81bea-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-v5j6b\" (UID: \"89fdb2fe-9563-4934-82c8-10c218d81bea\") " pod="openshift-authentication/oauth-openshift-558db77b4-v5j6b" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.199563 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/89fdb2fe-9563-4934-82c8-10c218d81bea-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-v5j6b\" (UID: \"89fdb2fe-9563-4934-82c8-10c218d81bea\") " pod="openshift-authentication/oauth-openshift-558db77b4-v5j6b" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.199588 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/89fdb2fe-9563-4934-82c8-10c218d81bea-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-v5j6b\" (UID: \"89fdb2fe-9563-4934-82c8-10c218d81bea\") " pod="openshift-authentication/oauth-openshift-558db77b4-v5j6b" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.199601 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrqkx\" (UniqueName: \"kubernetes.io/projected/89fdb2fe-9563-4934-82c8-10c218d81bea-kube-api-access-xrqkx\") pod \"oauth-openshift-558db77b4-v5j6b\" (UID: \"89fdb2fe-9563-4934-82c8-10c218d81bea\") " pod="openshift-authentication/oauth-openshift-558db77b4-v5j6b" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.199614 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77dac49a-34a8-421c-99ab-1fa4a08fc6fa-serving-cert\") pod \"openshift-config-operator-7777fb866f-mtdft\" (UID: \"77dac49a-34a8-421c-99ab-1fa4a08fc6fa\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mtdft" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.199641 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/89fdb2fe-9563-4934-82c8-10c218d81bea-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-v5j6b\" (UID: \"89fdb2fe-9563-4934-82c8-10c218d81bea\") " pod="openshift-authentication/oauth-openshift-558db77b4-v5j6b" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.199655 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/13cf2230-0cbf-49b1-92f4-b49b8d72acec-etcd-client\") pod \"apiserver-7bbb656c7d-5xggk\" (UID: \"13cf2230-0cbf-49b1-92f4-b49b8d72acec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5xggk" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.199674 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5a1d5a2-358f-4445-9253-5a9bc8b9daf3-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-wzp94\" (UID: \"b5a1d5a2-358f-4445-9253-5a9bc8b9daf3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wzp94" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.200093 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/e376f745-76d4-42dd-97f7-4554c02eec74-config\") pod \"kube-controller-manager-operator-78b949d7b-pw6lj\" (UID: \"e376f745-76d4-42dd-97f7-4554c02eec74\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-pw6lj" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.200116 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d3f2ca3d-d5ad-4a22-96d4-3bc4c5eb7a96-trusted-ca\") pod \"ingress-operator-5b745b69d9-2f6gc\" (UID: \"d3f2ca3d-d5ad-4a22-96d4-3bc4c5eb7a96\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2f6gc" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.200131 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/b7402585-182c-46a1-a8ac-257c97421c8d-audit\") pod \"apiserver-76f77b778f-xdwlv\" (UID: \"b7402585-182c-46a1-a8ac-257c97421c8d\") " pod="openshift-apiserver/apiserver-76f77b778f-xdwlv" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.200143 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/89fdb2fe-9563-4934-82c8-10c218d81bea-audit-dir\") pod \"oauth-openshift-558db77b4-v5j6b\" (UID: \"89fdb2fe-9563-4934-82c8-10c218d81bea\") " pod="openshift-authentication/oauth-openshift-558db77b4-v5j6b" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.200167 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/89fdb2fe-9563-4934-82c8-10c218d81bea-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-v5j6b\" (UID: \"89fdb2fe-9563-4934-82c8-10c218d81bea\") " pod="openshift-authentication/oauth-openshift-558db77b4-v5j6b" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.200181 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/adaadc75-0b7d-4699-8d61-fe5ebb7dc84a-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-9xbb6\" (UID: \"adaadc75-0b7d-4699-8d61-fe5ebb7dc84a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9xbb6" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.200197 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5926dc72-d5e9-4c49-afba-ac1ce69678be-serving-cert\") pod \"console-operator-58897d9998-7drj7\" (UID: \"5926dc72-d5e9-4c49-afba-ac1ce69678be\") " pod="openshift-console-operator/console-operator-58897d9998-7drj7" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.200211 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nv9fc\" (UniqueName: \"kubernetes.io/projected/7ce81b97-c663-47e5-b75e-ba3333d5cb9a-kube-api-access-nv9fc\") pod \"machine-approver-56656f9798-8fb58\" (UID: \"7ce81b97-c663-47e5-b75e-ba3333d5cb9a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8fb58" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.200337 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/eee41022-2a9c-41c8-bbc8-ff42874c4050-installation-pull-secrets\") pod \"image-registry-697d97f7c8-wd6tn\" (UID: \"eee41022-2a9c-41c8-bbc8-ff42874c4050\") " pod="openshift-image-registry/image-registry-697d97f7c8-wd6tn" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.200362 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f6a1efb-6c81-45a4-91e2-9f67f8a266be-config\") pod \"service-ca-operator-777779d784-b27rg\" (UID: \"6f6a1efb-6c81-45a4-91e2-9f67f8a266be\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-b27rg" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.200379 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5926dc72-d5e9-4c49-afba-ac1ce69678be-config\") pod \"console-operator-58897d9998-7drj7\" (UID: \"5926dc72-d5e9-4c49-afba-ac1ce69678be\") " pod="openshift-console-operator/console-operator-58897d9998-7drj7" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.200394 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d3f2ca3d-d5ad-4a22-96d4-3bc4c5eb7a96-bound-sa-token\") pod \"ingress-operator-5b745b69d9-2f6gc\" (UID: \"d3f2ca3d-d5ad-4a22-96d4-3bc4c5eb7a96\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2f6gc" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.200409 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sx7h6\" (UniqueName: \"kubernetes.io/projected/d3f2ca3d-d5ad-4a22-96d4-3bc4c5eb7a96-kube-api-access-sx7h6\") pod \"ingress-operator-5b745b69d9-2f6gc\" (UID: \"d3f2ca3d-d5ad-4a22-96d4-3bc4c5eb7a96\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2f6gc" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.200423 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/89fdb2fe-9563-4934-82c8-10c218d81bea-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-v5j6b\" (UID: \"89fdb2fe-9563-4934-82c8-10c218d81bea\") " pod="openshift-authentication/oauth-openshift-558db77b4-v5j6b" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.200448 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7402585-182c-46a1-a8ac-257c97421c8d-config\") pod \"apiserver-76f77b778f-xdwlv\" (UID: \"b7402585-182c-46a1-a8ac-257c97421c8d\") " pod="openshift-apiserver/apiserver-76f77b778f-xdwlv" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.200475 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/13cf2230-0cbf-49b1-92f4-b49b8d72acec-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-5xggk\" (UID: \"13cf2230-0cbf-49b1-92f4-b49b8d72acec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5xggk" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.200489 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmt28\" (UniqueName: 
\"kubernetes.io/projected/adaadc75-0b7d-4699-8d61-fe5ebb7dc84a-kube-api-access-gmt28\") pod \"cluster-image-registry-operator-dc59b4c8b-9xbb6\" (UID: \"adaadc75-0b7d-4699-8d61-fe5ebb7dc84a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9xbb6" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.200505 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d3f2ca3d-d5ad-4a22-96d4-3bc4c5eb7a96-metrics-tls\") pod \"ingress-operator-5b745b69d9-2f6gc\" (UID: \"d3f2ca3d-d5ad-4a22-96d4-3bc4c5eb7a96\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2f6gc" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.200519 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/13cf2230-0cbf-49b1-92f4-b49b8d72acec-serving-cert\") pod \"apiserver-7bbb656c7d-5xggk\" (UID: \"13cf2230-0cbf-49b1-92f4-b49b8d72acec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5xggk" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.200533 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b87660e6-af7f-42a9-bdb5-7bee97dab52f-config\") pod \"openshift-apiserver-operator-796bbdcf4f-mk586\" (UID: \"b87660e6-af7f-42a9-bdb5-7bee97dab52f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mk586" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.200550 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d15dddf-4a72-4590-9def-755a9fd11da6-service-ca-bundle\") pod \"authentication-operator-69f744f599-5kplw\" (UID: \"2d15dddf-4a72-4590-9def-755a9fd11da6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5kplw" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.200564 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpcg2\" (UniqueName: \"kubernetes.io/projected/b7402585-182c-46a1-a8ac-257c97421c8d-kube-api-access-cpcg2\") pod \"apiserver-76f77b778f-xdwlv\" (UID: \"b7402585-182c-46a1-a8ac-257c97421c8d\") " pod="openshift-apiserver/apiserver-76f77b778f-xdwlv" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.200576 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/897e7b14-cb3a-40e7-a127-587bc5e9502a-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-gkckz\" (UID: \"897e7b14-cb3a-40e7-a127-587bc5e9502a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gkckz" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.200590 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e40fc0c2-28f1-444d-8ff4-8f11faa1db2b-serving-cert\") pod \"etcd-operator-b45778765-nqlbc\" (UID: \"e40fc0c2-28f1-444d-8ff4-8f11faa1db2b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nqlbc" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.200609 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wd6tn\" (UID: \"eee41022-2a9c-41c8-bbc8-ff42874c4050\") " pod="openshift-image-registry/image-registry-697d97f7c8-wd6tn" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.200624 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5926dc72-d5e9-4c49-afba-ac1ce69678be-trusted-ca\") pod \"console-operator-58897d9998-7drj7\" (UID: \"5926dc72-d5e9-4c49-afba-ac1ce69678be\") " pod="openshift-console-operator/console-operator-58897d9998-7drj7" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.200637 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a7fc2be6-0cb9-4fad-ac58-37004264a563-service-ca\") pod \"console-f9d7485db-vhncm\" (UID: \"a7fc2be6-0cb9-4fad-ac58-37004264a563\") " pod="openshift-console/console-f9d7485db-vhncm" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.200650 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/89fdb2fe-9563-4934-82c8-10c218d81bea-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-v5j6b\" (UID: \"89fdb2fe-9563-4934-82c8-10c218d81bea\") " pod="openshift-authentication/oauth-openshift-558db77b4-v5j6b" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.200664 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/e40fc0c2-28f1-444d-8ff4-8f11faa1db2b-etcd-service-ca\") pod \"etcd-operator-b45778765-nqlbc\" (UID: \"e40fc0c2-28f1-444d-8ff4-8f11faa1db2b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nqlbc" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.200677 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b7402585-182c-46a1-a8ac-257c97421c8d-etcd-client\") pod \"apiserver-76f77b778f-xdwlv\" (UID: \"b7402585-182c-46a1-a8ac-257c97421c8d\") " pod="openshift-apiserver/apiserver-76f77b778f-xdwlv" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.200690 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e376f745-76d4-42dd-97f7-4554c02eec74-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-pw6lj\" (UID: \"e376f745-76d4-42dd-97f7-4554c02eec74\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-pw6lj" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.200703 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/eee41022-2a9c-41c8-bbc8-ff42874c4050-bound-sa-token\") pod \"image-registry-697d97f7c8-wd6tn\" (UID: \"eee41022-2a9c-41c8-bbc8-ff42874c4050\") " pod="openshift-image-registry/image-registry-697d97f7c8-wd6tn" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.200717 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/a7fc2be6-0cb9-4fad-ac58-37004264a563-oauth-serving-cert\") pod \"console-f9d7485db-vhncm\" (UID: \"a7fc2be6-0cb9-4fad-ac58-37004264a563\") " pod="openshift-console/console-f9d7485db-vhncm" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.200738 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/13cf2230-0cbf-49b1-92f4-b49b8d72acec-encryption-config\") pod \"apiserver-7bbb656c7d-5xggk\" (UID: \"13cf2230-0cbf-49b1-92f4-b49b8d72acec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5xggk" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.200752 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwzpz\" (UniqueName: \"kubernetes.io/projected/5446f927-f240-4554-832a-28b950424eea-kube-api-access-qwzpz\") pod \"dns-operator-744455d44c-srtkn\" (UID: \"5446f927-f240-4554-832a-28b950424eea\") " pod="openshift-dns-operator/dns-operator-744455d44c-srtkn" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.200766 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnspm\" (UniqueName: \"kubernetes.io/projected/5926dc72-d5e9-4c49-afba-ac1ce69678be-kube-api-access-dnspm\") pod \"console-operator-58897d9998-7drj7\" (UID: \"5926dc72-d5e9-4c49-afba-ac1ce69678be\") " pod="openshift-console-operator/console-operator-58897d9998-7drj7" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.200780 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/77dac49a-34a8-421c-99ab-1fa4a08fc6fa-available-featuregates\") pod \"openshift-config-operator-7777fb866f-mtdft\" (UID: \"77dac49a-34a8-421c-99ab-1fa4a08fc6fa\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mtdft" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.200792 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dn2kc\" (UniqueName: \"kubernetes.io/projected/eee41022-2a9c-41c8-bbc8-ff42874c4050-kube-api-access-dn2kc\") pod \"image-registry-697d97f7c8-wd6tn\" (UID: \"eee41022-2a9c-41c8-bbc8-ff42874c4050\") " pod="openshift-image-registry/image-registry-697d97f7c8-wd6tn" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.200805 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7ce81b97-c663-47e5-b75e-ba3333d5cb9a-auth-proxy-config\") pod \"machine-approver-56656f9798-8fb58\" (UID: \"7ce81b97-c663-47e5-b75e-ba3333d5cb9a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8fb58" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.200816 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b7402585-182c-46a1-a8ac-257c97421c8d-audit-dir\") pod \"apiserver-76f77b778f-xdwlv\" (UID: \"b7402585-182c-46a1-a8ac-257c97421c8d\") " pod="openshift-apiserver/apiserver-76f77b778f-xdwlv" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.200829 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e40fc0c2-28f1-444d-8ff4-8f11faa1db2b-config\") pod 
\"etcd-operator-b45778765-nqlbc\" (UID: \"e40fc0c2-28f1-444d-8ff4-8f11faa1db2b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nqlbc" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.200844 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/89fdb2fe-9563-4934-82c8-10c218d81bea-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-v5j6b\" (UID: \"89fdb2fe-9563-4934-82c8-10c218d81bea\") " pod="openshift-authentication/oauth-openshift-558db77b4-v5j6b" Dec 09 15:00:25 crc kubenswrapper[4555]: E1209 15:00:25.201146 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 15:00:25.701136426 +0000 UTC m=+135.837849144 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wd6tn" (UID: "eee41022-2a9c-41c8-bbc8-ff42874c4050") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.226518 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7h6sb" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.229795 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qdpr6"] Dec 09 15:00:25 crc kubenswrapper[4555]: W1209 15:00:25.253158 4555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0f5603f_8e68_4cfb_9aa1_a6854b684172.slice/crio-2e194e8279b55b50ed2f0451d6c3122ce427b731c639a95d8944879b378b1e91 WatchSource:0}: Error finding container 2e194e8279b55b50ed2f0451d6c3122ce427b731c639a95d8944879b378b1e91: Status 404 returned error can't find the container with id 2e194e8279b55b50ed2f0451d6c3122ce427b731c639a95d8944879b378b1e91 Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.281816 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-vbr87"] Dec 09 15:00:25 crc kubenswrapper[4555]: W1209 15:00:25.297624 4555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf86821ee_dfd8_4090_bd6c_8f624a1362ae.slice/crio-b77f95d494f228666f8790acd98fd9284669a6612ee5c458b2667802235ba435 WatchSource:0}: Error finding container b77f95d494f228666f8790acd98fd9284669a6612ee5c458b2667802235ba435: Status 404 returned error can't find the container with id b77f95d494f228666f8790acd98fd9284669a6612ee5c458b2667802235ba435 Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.301703 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 
15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.301952 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d73470df-6b49-4a9f-bcd8-a0acac5cb921-proxy-tls\") pod \"machine-config-operator-74547568cd-r9tjc\" (UID: \"d73470df-6b49-4a9f-bcd8-a0acac5cb921\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-r9tjc" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.301993 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/897e7b14-cb3a-40e7-a127-587bc5e9502a-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-gkckz\" (UID: \"897e7b14-cb3a-40e7-a127-587bc5e9502a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gkckz" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.302009 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/89fdb2fe-9563-4934-82c8-10c218d81bea-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-v5j6b\" (UID: \"89fdb2fe-9563-4934-82c8-10c218d81bea\") " pod="openshift-authentication/oauth-openshift-558db77b4-v5j6b" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.302024 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dlj7\" (UniqueName: \"kubernetes.io/projected/e40fc0c2-28f1-444d-8ff4-8f11faa1db2b-kube-api-access-9dlj7\") pod \"etcd-operator-b45778765-nqlbc\" (UID: \"e40fc0c2-28f1-444d-8ff4-8f11faa1db2b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nqlbc" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.302040 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1908bc6b-523d-4ffd-8e4c-e73f3e3b7ee3-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-msfrj\" (UID: \"1908bc6b-523d-4ffd-8e4c-e73f3e3b7ee3\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-msfrj" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.302056 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/33ad6890-1e95-4a46-9f40-00cf0443d4a1-service-ca-bundle\") pod \"router-default-5444994796-pldb7\" (UID: \"33ad6890-1e95-4a46-9f40-00cf0443d4a1\") " pod="openshift-ingress/router-default-5444994796-pldb7" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.302071 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/eee41022-2a9c-41c8-bbc8-ff42874c4050-ca-trust-extracted\") pod \"image-registry-697d97f7c8-wd6tn\" (UID: \"eee41022-2a9c-41c8-bbc8-ff42874c4050\") " pod="openshift-image-registry/image-registry-697d97f7c8-wd6tn" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.302085 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e376f745-76d4-42dd-97f7-4554c02eec74-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-pw6lj\" (UID: \"e376f745-76d4-42dd-97f7-4554c02eec74\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-pw6lj" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.302098 4555 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e40fc0c2-28f1-444d-8ff4-8f11faa1db2b-etcd-client\") pod \"etcd-operator-b45778765-nqlbc\" (UID: \"e40fc0c2-28f1-444d-8ff4-8f11faa1db2b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nqlbc" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.302112 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89fdb2fe-9563-4934-82c8-10c218d81bea-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-v5j6b\" (UID: \"89fdb2fe-9563-4934-82c8-10c218d81bea\") " pod="openshift-authentication/oauth-openshift-558db77b4-v5j6b" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.302127 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9krkk\" (UniqueName: \"kubernetes.io/projected/6f6a1efb-6c81-45a4-91e2-9f67f8a266be-kube-api-access-9krkk\") pod \"service-ca-operator-777779d784-b27rg\" (UID: \"6f6a1efb-6c81-45a4-91e2-9f67f8a266be\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-b27rg" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.302141 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnh65\" (UniqueName: \"kubernetes.io/projected/0a4f7efd-8434-48a7-979e-9f128e93797b-kube-api-access-nnh65\") pod \"cluster-samples-operator-665b6dd947-jxl9b\" (UID: \"0a4f7efd-8434-48a7-979e-9f128e93797b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jxl9b" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.302154 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f6a1efb-6c81-45a4-91e2-9f67f8a266be-serving-cert\") pod \"service-ca-operator-777779d784-b27rg\" (UID: \"6f6a1efb-6c81-45a4-91e2-9f67f8a266be\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-b27rg" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.302166 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/89fdb2fe-9563-4934-82c8-10c218d81bea-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-v5j6b\" (UID: \"89fdb2fe-9563-4934-82c8-10c218d81bea\") " pod="openshift-authentication/oauth-openshift-558db77b4-v5j6b" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.302179 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c1b1ecba-4701-4e11-b2e9-1327cc2c5c2d-cert\") pod \"ingress-canary-89w6m\" (UID: \"c1b1ecba-4701-4e11-b2e9-1327cc2c5c2d\") " pod="openshift-ingress-canary/ingress-canary-89w6m" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.302193 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a7fc2be6-0cb9-4fad-ac58-37004264a563-console-config\") pod \"console-f9d7485db-vhncm\" (UID: \"a7fc2be6-0cb9-4fad-ac58-37004264a563\") " pod="openshift-console/console-f9d7485db-vhncm" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.302206 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/a7fc2be6-0cb9-4fad-ac58-37004264a563-trusted-ca-bundle\") pod \"console-f9d7485db-vhncm\" (UID: \"a7fc2be6-0cb9-4fad-ac58-37004264a563\") " pod="openshift-console/console-f9d7485db-vhncm" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.302220 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/e40fc0c2-28f1-444d-8ff4-8f11faa1db2b-etcd-ca\") pod \"etcd-operator-b45778765-nqlbc\" (UID: \"e40fc0c2-28f1-444d-8ff4-8f11faa1db2b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nqlbc" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.302235 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqnnj\" (UniqueName: \"kubernetes.io/projected/d98a1c63-4a49-437d-97a3-c9c37efce476-kube-api-access-nqnnj\") pod \"packageserver-d55dfcdfc-7trl6\" (UID: \"d98a1c63-4a49-437d-97a3-c9c37efce476\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7trl6" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.302251 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/33ad6890-1e95-4a46-9f40-00cf0443d4a1-stats-auth\") pod \"router-default-5444994796-pldb7\" (UID: \"33ad6890-1e95-4a46-9f40-00cf0443d4a1\") " pod="openshift-ingress/router-default-5444994796-pldb7" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.302265 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/eee41022-2a9c-41c8-bbc8-ff42874c4050-registry-tls\") pod \"image-registry-697d97f7c8-wd6tn\" (UID: \"eee41022-2a9c-41c8-bbc8-ff42874c4050\") " pod="openshift-image-registry/image-registry-697d97f7c8-wd6tn" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.302277 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b7402585-182c-46a1-a8ac-257c97421c8d-serving-cert\") pod \"apiserver-76f77b778f-xdwlv\" (UID: \"b7402585-182c-46a1-a8ac-257c97421c8d\") " pod="openshift-apiserver/apiserver-76f77b778f-xdwlv" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.302291 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a7fc2be6-0cb9-4fad-ac58-37004264a563-console-oauth-config\") pod \"console-f9d7485db-vhncm\" (UID: \"a7fc2be6-0cb9-4fad-ac58-37004264a563\") " pod="openshift-console/console-f9d7485db-vhncm" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.302318 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzpv7\" (UniqueName: \"kubernetes.io/projected/1908bc6b-523d-4ffd-8e4c-e73f3e3b7ee3-kube-api-access-rzpv7\") pod \"multus-admission-controller-857f4d67dd-msfrj\" (UID: \"1908bc6b-523d-4ffd-8e4c-e73f3e3b7ee3\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-msfrj" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.302351 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxkkl\" (UniqueName: \"kubernetes.io/projected/a77db5eb-461b-49d7-a2d7-91be5bb52a09-kube-api-access-fxkkl\") pod \"catalog-operator-68c6474976-gt6wf\" (UID: \"a77db5eb-461b-49d7-a2d7-91be5bb52a09\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gt6wf" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.302365 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5a1d5a2-358f-4445-9253-5a9bc8b9daf3-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-wzp94\" (UID: \"b5a1d5a2-358f-4445-9253-5a9bc8b9daf3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wzp94" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.302389 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/13cf2230-0cbf-49b1-92f4-b49b8d72acec-audit-policies\") pod \"apiserver-7bbb656c7d-5xggk\" (UID: \"13cf2230-0cbf-49b1-92f4-b49b8d72acec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5xggk" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.302419 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/dc25bf0a-f5e4-4f6d-9442-6b55a043f751-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-rsbss\" (UID: \"dc25bf0a-f5e4-4f6d-9442-6b55a043f751\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rsbss" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.302446 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/bc98cfae-5f65-416a-ba4b-24f601ff6209-certs\") pod \"machine-config-server-whvjt\" (UID: \"bc98cfae-5f65-416a-ba4b-24f601ff6209\") " pod="openshift-machine-config-operator/machine-config-server-whvjt" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.302527 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b7402585-182c-46a1-a8ac-257c97421c8d-encryption-config\") pod \"apiserver-76f77b778f-xdwlv\" (UID: \"b7402585-182c-46a1-a8ac-257c97421c8d\") " pod="openshift-apiserver/apiserver-76f77b778f-xdwlv" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.302568 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/13cf2230-0cbf-49b1-92f4-b49b8d72acec-audit-dir\") pod \"apiserver-7bbb656c7d-5xggk\" (UID: \"13cf2230-0cbf-49b1-92f4-b49b8d72acec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5xggk" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.302597 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/89fdb2fe-9563-4934-82c8-10c218d81bea-audit-policies\") pod \"oauth-openshift-558db77b4-v5j6b\" (UID: \"89fdb2fe-9563-4934-82c8-10c218d81bea\") " pod="openshift-authentication/oauth-openshift-558db77b4-v5j6b" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.302625 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldmfx\" (UniqueName: \"kubernetes.io/projected/0445dab5-5a42-4098-9fe0-e38b5650ecd6-kube-api-access-ldmfx\") pod \"csi-hostpathplugin-2wsmx\" (UID: \"0445dab5-5a42-4098-9fe0-e38b5650ecd6\") " pod="hostpath-provisioner/csi-hostpathplugin-2wsmx" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.302654 4555 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/0a4f7efd-8434-48a7-979e-9f128e93797b-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-jxl9b\" (UID: \"0a4f7efd-8434-48a7-979e-9f128e93797b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jxl9b" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.302677 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3936549f-6055-4b0d-af18-7958d91c672f-secret-volume\") pod \"collect-profiles-29421540-hhxsf\" (UID: \"3936549f-6055-4b0d-af18-7958d91c672f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421540-hhxsf" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.302693 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b7402585-182c-46a1-a8ac-257c97421c8d-trusted-ca-bundle\") pod \"apiserver-76f77b778f-xdwlv\" (UID: \"b7402585-182c-46a1-a8ac-257c97421c8d\") " pod="openshift-apiserver/apiserver-76f77b778f-xdwlv" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.302711 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/147065ae-e48c-4712-8d72-016b2a692da8-profile-collector-cert\") pod \"olm-operator-6b444d44fb-skgqs\" (UID: \"147065ae-e48c-4712-8d72-016b2a692da8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-skgqs" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.302742 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/adaadc75-0b7d-4699-8d61-fe5ebb7dc84a-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-9xbb6\" (UID: \"adaadc75-0b7d-4699-8d61-fe5ebb7dc84a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9xbb6" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.302757 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/13cf2230-0cbf-49b1-92f4-b49b8d72acec-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-5xggk\" (UID: \"13cf2230-0cbf-49b1-92f4-b49b8d72acec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5xggk" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.302772 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/eee41022-2a9c-41c8-bbc8-ff42874c4050-registry-certificates\") pod \"image-registry-697d97f7c8-wd6tn\" (UID: \"eee41022-2a9c-41c8-bbc8-ff42874c4050\") " pod="openshift-image-registry/image-registry-697d97f7c8-wd6tn" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.302798 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/897e7b14-cb3a-40e7-a127-587bc5e9502a-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-gkckz\" (UID: \"897e7b14-cb3a-40e7-a127-587bc5e9502a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gkckz" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.302812 4555 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/0445dab5-5a42-4098-9fe0-e38b5650ecd6-mountpoint-dir\") pod \"csi-hostpathplugin-2wsmx\" (UID: \"0445dab5-5a42-4098-9fe0-e38b5650ecd6\") " pod="hostpath-provisioner/csi-hostpathplugin-2wsmx" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.302825 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d15dddf-4a72-4590-9def-755a9fd11da6-serving-cert\") pod \"authentication-operator-69f744f599-5kplw\" (UID: \"2d15dddf-4a72-4590-9def-755a9fd11da6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5kplw" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.302882 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ce81b97-c663-47e5-b75e-ba3333d5cb9a-config\") pod \"machine-approver-56656f9798-8fb58\" (UID: \"7ce81b97-c663-47e5-b75e-ba3333d5cb9a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8fb58" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.302907 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6698v\" (UniqueName: \"kubernetes.io/projected/2d15dddf-4a72-4590-9def-755a9fd11da6-kube-api-access-6698v\") pod \"authentication-operator-69f744f599-5kplw\" (UID: \"2d15dddf-4a72-4590-9def-755a9fd11da6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5kplw" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.302921 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b87660e6-af7f-42a9-bdb5-7bee97dab52f-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-mk586\" (UID: \"b87660e6-af7f-42a9-bdb5-7bee97dab52f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mk586" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.302934 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jjbm\" (UniqueName: \"kubernetes.io/projected/13cf2230-0cbf-49b1-92f4-b49b8d72acec-kube-api-access-6jjbm\") pod \"apiserver-7bbb656c7d-5xggk\" (UID: \"13cf2230-0cbf-49b1-92f4-b49b8d72acec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5xggk" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.302946 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/147065ae-e48c-4712-8d72-016b2a692da8-srv-cert\") pod \"olm-operator-6b444d44fb-skgqs\" (UID: \"147065ae-e48c-4712-8d72-016b2a692da8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-skgqs" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.302961 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8c1c0513-5d12-463c-b66d-c1a5d02eafc0-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-srlsw\" (UID: \"8c1c0513-5d12-463c-b66d-c1a5d02eafc0\") " pod="openshift-marketplace/marketplace-operator-79b997595-srlsw" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.302974 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/0445dab5-5a42-4098-9fe0-e38b5650ecd6-registration-dir\") pod \"csi-hostpathplugin-2wsmx\" (UID: \"0445dab5-5a42-4098-9fe0-e38b5650ecd6\") " pod="hostpath-provisioner/csi-hostpathplugin-2wsmx" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.302989 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fswsp\" (UniqueName: \"kubernetes.io/projected/c1b1ecba-4701-4e11-b2e9-1327cc2c5c2d-kube-api-access-fswsp\") pod \"ingress-canary-89w6m\" (UID: \"c1b1ecba-4701-4e11-b2e9-1327cc2c5c2d\") " pod="openshift-ingress-canary/ingress-canary-89w6m" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.303012 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqd8z\" (UniqueName: \"kubernetes.io/projected/b87660e6-af7f-42a9-bdb5-7bee97dab52f-kube-api-access-fqd8z\") pod \"openshift-apiserver-operator-796bbdcf4f-mk586\" (UID: \"b87660e6-af7f-42a9-bdb5-7bee97dab52f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mk586" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.303025 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/b7402585-182c-46a1-a8ac-257c97421c8d-image-import-ca\") pod \"apiserver-76f77b778f-xdwlv\" (UID: \"b7402585-182c-46a1-a8ac-257c97421c8d\") " pod="openshift-apiserver/apiserver-76f77b778f-xdwlv" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.303095 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ns6z2\" (UniqueName: \"kubernetes.io/projected/a7fc2be6-0cb9-4fad-ac58-37004264a563-kube-api-access-ns6z2\") pod \"console-f9d7485db-vhncm\" (UID: \"a7fc2be6-0cb9-4fad-ac58-37004264a563\") " pod="openshift-console/console-f9d7485db-vhncm" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.303115 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/adaadc75-0b7d-4699-8d61-fe5ebb7dc84a-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-9xbb6\" (UID: \"adaadc75-0b7d-4699-8d61-fe5ebb7dc84a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9xbb6" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.303129 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b7402585-182c-46a1-a8ac-257c97421c8d-node-pullsecrets\") pod \"apiserver-76f77b778f-xdwlv\" (UID: \"b7402585-182c-46a1-a8ac-257c97421c8d\") " pod="openshift-apiserver/apiserver-76f77b778f-xdwlv" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.303188 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b7402585-182c-46a1-a8ac-257c97421c8d-etcd-serving-ca\") pod \"apiserver-76f77b778f-xdwlv\" (UID: \"b7402585-182c-46a1-a8ac-257c97421c8d\") " pod="openshift-apiserver/apiserver-76f77b778f-xdwlv" Dec 09 15:00:25 crc kubenswrapper[4555]: E1209 15:00:25.303230 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 15:00:25.803213489 +0000 UTC m=+135.939926208 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.303257 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/7ce81b97-c663-47e5-b75e-ba3333d5cb9a-machine-approver-tls\") pod \"machine-approver-56656f9798-8fb58\" (UID: \"7ce81b97-c663-47e5-b75e-ba3333d5cb9a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8fb58" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.303293 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zl7qg\" (UniqueName: \"kubernetes.io/projected/dc25bf0a-f5e4-4f6d-9442-6b55a043f751-kube-api-access-zl7qg\") pod \"control-plane-machine-set-operator-78cbb6b69f-rsbss\" (UID: \"dc25bf0a-f5e4-4f6d-9442-6b55a043f751\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rsbss" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.303330 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5446f927-f240-4554-832a-28b950424eea-metrics-tls\") pod \"dns-operator-744455d44c-srtkn\" (UID: \"5446f927-f240-4554-832a-28b950424eea\") " pod="openshift-dns-operator/dns-operator-744455d44c-srtkn" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.303347 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/89fdb2fe-9563-4934-82c8-10c218d81bea-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-v5j6b\" (UID: \"89fdb2fe-9563-4934-82c8-10c218d81bea\") " pod="openshift-authentication/oauth-openshift-558db77b4-v5j6b" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.303362 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/89fdb2fe-9563-4934-82c8-10c218d81bea-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-v5j6b\" (UID: \"89fdb2fe-9563-4934-82c8-10c218d81bea\") " pod="openshift-authentication/oauth-openshift-558db77b4-v5j6b" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.303378 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/89fdb2fe-9563-4934-82c8-10c218d81bea-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-v5j6b\" (UID: \"89fdb2fe-9563-4934-82c8-10c218d81bea\") " pod="openshift-authentication/oauth-openshift-558db77b4-v5j6b" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.303397 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77dac49a-34a8-421c-99ab-1fa4a08fc6fa-serving-cert\") pod \"openshift-config-operator-7777fb866f-mtdft\" (UID: \"77dac49a-34a8-421c-99ab-1fa4a08fc6fa\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mtdft" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 
15:00:25.303404 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89fdb2fe-9563-4934-82c8-10c218d81bea-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-v5j6b\" (UID: \"89fdb2fe-9563-4934-82c8-10c218d81bea\") " pod="openshift-authentication/oauth-openshift-558db77b4-v5j6b" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.303416 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/89fdb2fe-9563-4934-82c8-10c218d81bea-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-v5j6b\" (UID: \"89fdb2fe-9563-4934-82c8-10c218d81bea\") " pod="openshift-authentication/oauth-openshift-558db77b4-v5j6b" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.303432 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrqkx\" (UniqueName: \"kubernetes.io/projected/89fdb2fe-9563-4934-82c8-10c218d81bea-kube-api-access-xrqkx\") pod \"oauth-openshift-558db77b4-v5j6b\" (UID: \"89fdb2fe-9563-4934-82c8-10c218d81bea\") " pod="openshift-authentication/oauth-openshift-558db77b4-v5j6b" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.303449 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/13cf2230-0cbf-49b1-92f4-b49b8d72acec-etcd-client\") pod \"apiserver-7bbb656c7d-5xggk\" (UID: \"13cf2230-0cbf-49b1-92f4-b49b8d72acec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5xggk" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.303474 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d98a1c63-4a49-437d-97a3-c9c37efce476-apiservice-cert\") pod \"packageserver-d55dfcdfc-7trl6\" (UID: \"d98a1c63-4a49-437d-97a3-c9c37efce476\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7trl6" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.303493 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5a1d5a2-358f-4445-9253-5a9bc8b9daf3-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-wzp94\" (UID: \"b5a1d5a2-358f-4445-9253-5a9bc8b9daf3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wzp94" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.303508 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e376f745-76d4-42dd-97f7-4554c02eec74-config\") pod \"kube-controller-manager-operator-78b949d7b-pw6lj\" (UID: \"e376f745-76d4-42dd-97f7-4554c02eec74\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-pw6lj" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.303523 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/33ad6890-1e95-4a46-9f40-00cf0443d4a1-default-certificate\") pod \"router-default-5444994796-pldb7\" (UID: \"33ad6890-1e95-4a46-9f40-00cf0443d4a1\") " pod="openshift-ingress/router-default-5444994796-pldb7" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.303558 4555 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/33ad6890-1e95-4a46-9f40-00cf0443d4a1-metrics-certs\") pod \"router-default-5444994796-pldb7\" (UID: \"33ad6890-1e95-4a46-9f40-00cf0443d4a1\") " pod="openshift-ingress/router-default-5444994796-pldb7" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.303588 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d3f2ca3d-d5ad-4a22-96d4-3bc4c5eb7a96-trusted-ca\") pod \"ingress-operator-5b745b69d9-2f6gc\" (UID: \"d3f2ca3d-d5ad-4a22-96d4-3bc4c5eb7a96\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2f6gc" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.303605 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wj6jr\" (UniqueName: \"kubernetes.io/projected/bc98cfae-5f65-416a-ba4b-24f601ff6209-kube-api-access-wj6jr\") pod \"machine-config-server-whvjt\" (UID: \"bc98cfae-5f65-416a-ba4b-24f601ff6209\") " pod="openshift-machine-config-operator/machine-config-server-whvjt" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.303620 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/b7402585-182c-46a1-a8ac-257c97421c8d-audit\") pod \"apiserver-76f77b778f-xdwlv\" (UID: \"b7402585-182c-46a1-a8ac-257c97421c8d\") " pod="openshift-apiserver/apiserver-76f77b778f-xdwlv" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.303635 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/89fdb2fe-9563-4934-82c8-10c218d81bea-audit-dir\") pod \"oauth-openshift-558db77b4-v5j6b\" (UID: \"89fdb2fe-9563-4934-82c8-10c218d81bea\") " pod="openshift-authentication/oauth-openshift-558db77b4-v5j6b" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.303649 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d98a1c63-4a49-437d-97a3-c9c37efce476-webhook-cert\") pod \"packageserver-d55dfcdfc-7trl6\" (UID: \"d98a1c63-4a49-437d-97a3-c9c37efce476\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7trl6" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.303664 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/adaadc75-0b7d-4699-8d61-fe5ebb7dc84a-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-9xbb6\" (UID: \"adaadc75-0b7d-4699-8d61-fe5ebb7dc84a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9xbb6" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.303711 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5926dc72-d5e9-4c49-afba-ac1ce69678be-serving-cert\") pod \"console-operator-58897d9998-7drj7\" (UID: \"5926dc72-d5e9-4c49-afba-ac1ce69678be\") " pod="openshift-console-operator/console-operator-58897d9998-7drj7" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.303730 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nv9fc\" (UniqueName: \"kubernetes.io/projected/7ce81b97-c663-47e5-b75e-ba3333d5cb9a-kube-api-access-nv9fc\") pod \"machine-approver-56656f9798-8fb58\" (UID: 
\"7ce81b97-c663-47e5-b75e-ba3333d5cb9a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8fb58" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.303746 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/89fdb2fe-9563-4934-82c8-10c218d81bea-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-v5j6b\" (UID: \"89fdb2fe-9563-4934-82c8-10c218d81bea\") " pod="openshift-authentication/oauth-openshift-558db77b4-v5j6b" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.304011 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/897e7b14-cb3a-40e7-a127-587bc5e9502a-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-gkckz\" (UID: \"897e7b14-cb3a-40e7-a127-587bc5e9502a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gkckz" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.304355 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/13cf2230-0cbf-49b1-92f4-b49b8d72acec-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-5xggk\" (UID: \"13cf2230-0cbf-49b1-92f4-b49b8d72acec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5xggk" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.305188 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/eee41022-2a9c-41c8-bbc8-ff42874c4050-installation-pull-secrets\") pod \"image-registry-697d97f7c8-wd6tn\" (UID: \"eee41022-2a9c-41c8-bbc8-ff42874c4050\") " pod="openshift-image-registry/image-registry-697d97f7c8-wd6tn" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.305218 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f6a1efb-6c81-45a4-91e2-9f67f8a266be-config\") pod \"service-ca-operator-777779d784-b27rg\" (UID: \"6f6a1efb-6c81-45a4-91e2-9f67f8a266be\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-b27rg" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.305259 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5926dc72-d5e9-4c49-afba-ac1ce69678be-config\") pod \"console-operator-58897d9998-7drj7\" (UID: \"5926dc72-d5e9-4c49-afba-ac1ce69678be\") " pod="openshift-console-operator/console-operator-58897d9998-7drj7" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.305277 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d3f2ca3d-d5ad-4a22-96d4-3bc4c5eb7a96-bound-sa-token\") pod \"ingress-operator-5b745b69d9-2f6gc\" (UID: \"d3f2ca3d-d5ad-4a22-96d4-3bc4c5eb7a96\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2f6gc" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.305484 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b7402585-182c-46a1-a8ac-257c97421c8d-node-pullsecrets\") pod \"apiserver-76f77b778f-xdwlv\" (UID: \"b7402585-182c-46a1-a8ac-257c97421c8d\") " pod="openshift-apiserver/apiserver-76f77b778f-xdwlv" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.305996 4555 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5926dc72-d5e9-4c49-afba-ac1ce69678be-config\") pod \"console-operator-58897d9998-7drj7\" (UID: \"5926dc72-d5e9-4c49-afba-ac1ce69678be\") " pod="openshift-console-operator/console-operator-58897d9998-7drj7" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.306049 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/89fdb2fe-9563-4934-82c8-10c218d81bea-audit-dir\") pod \"oauth-openshift-558db77b4-v5j6b\" (UID: \"89fdb2fe-9563-4934-82c8-10c218d81bea\") " pod="openshift-authentication/oauth-openshift-558db77b4-v5j6b" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.306154 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f6a1efb-6c81-45a4-91e2-9f67f8a266be-config\") pod \"service-ca-operator-777779d784-b27rg\" (UID: \"6f6a1efb-6c81-45a4-91e2-9f67f8a266be\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-b27rg" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.306277 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sx7h6\" (UniqueName: \"kubernetes.io/projected/d3f2ca3d-d5ad-4a22-96d4-3bc4c5eb7a96-kube-api-access-sx7h6\") pod \"ingress-operator-5b745b69d9-2f6gc\" (UID: \"d3f2ca3d-d5ad-4a22-96d4-3bc4c5eb7a96\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2f6gc" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.306376 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/89fdb2fe-9563-4934-82c8-10c218d81bea-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-v5j6b\" (UID: \"89fdb2fe-9563-4934-82c8-10c218d81bea\") " pod="openshift-authentication/oauth-openshift-558db77b4-v5j6b" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.306397 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/0445dab5-5a42-4098-9fe0-e38b5650ecd6-plugins-dir\") pod \"csi-hostpathplugin-2wsmx\" (UID: \"0445dab5-5a42-4098-9fe0-e38b5650ecd6\") " pod="hostpath-provisioner/csi-hostpathplugin-2wsmx" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.306416 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2267e449-66c1-4699-897b-9851e54d2645-config-volume\") pod \"dns-default-9l7sb\" (UID: \"2267e449-66c1-4699-897b-9851e54d2645\") " pod="openshift-dns/dns-default-9l7sb" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.306433 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7402585-182c-46a1-a8ac-257c97421c8d-config\") pod \"apiserver-76f77b778f-xdwlv\" (UID: \"b7402585-182c-46a1-a8ac-257c97421c8d\") " pod="openshift-apiserver/apiserver-76f77b778f-xdwlv" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.306449 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/13cf2230-0cbf-49b1-92f4-b49b8d72acec-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-5xggk\" (UID: \"13cf2230-0cbf-49b1-92f4-b49b8d72acec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5xggk" Dec 09 15:00:25 crc 
kubenswrapper[4555]: I1209 15:00:25.306477 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmt28\" (UniqueName: \"kubernetes.io/projected/adaadc75-0b7d-4699-8d61-fe5ebb7dc84a-kube-api-access-gmt28\") pod \"cluster-image-registry-operator-dc59b4c8b-9xbb6\" (UID: \"adaadc75-0b7d-4699-8d61-fe5ebb7dc84a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9xbb6" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.306493 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gs7v4\" (UniqueName: \"kubernetes.io/projected/468e155a-66ab-4823-b737-7e1f9c78ac3f-kube-api-access-gs7v4\") pod \"package-server-manager-789f6589d5-wt67p\" (UID: \"468e155a-66ab-4823-b737-7e1f9c78ac3f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-wt67p" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.306510 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/c03f49e6-2884-451b-af8c-9d47555d1413-signing-key\") pod \"service-ca-9c57cc56f-4tvxr\" (UID: \"c03f49e6-2884-451b-af8c-9d47555d1413\") " pod="openshift-service-ca/service-ca-9c57cc56f-4tvxr" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.306527 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d3f2ca3d-d5ad-4a22-96d4-3bc4c5eb7a96-metrics-tls\") pod \"ingress-operator-5b745b69d9-2f6gc\" (UID: \"d3f2ca3d-d5ad-4a22-96d4-3bc4c5eb7a96\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2f6gc" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.306542 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/13cf2230-0cbf-49b1-92f4-b49b8d72acec-serving-cert\") pod \"apiserver-7bbb656c7d-5xggk\" (UID: \"13cf2230-0cbf-49b1-92f4-b49b8d72acec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5xggk" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.306558 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b87660e6-af7f-42a9-bdb5-7bee97dab52f-config\") pod \"openshift-apiserver-operator-796bbdcf4f-mk586\" (UID: \"b87660e6-af7f-42a9-bdb5-7bee97dab52f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mk586" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.306572 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a77db5eb-461b-49d7-a2d7-91be5bb52a09-srv-cert\") pod \"catalog-operator-68c6474976-gt6wf\" (UID: \"a77db5eb-461b-49d7-a2d7-91be5bb52a09\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gt6wf" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.306597 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d15dddf-4a72-4590-9def-755a9fd11da6-service-ca-bundle\") pod \"authentication-operator-69f744f599-5kplw\" (UID: \"2d15dddf-4a72-4590-9def-755a9fd11da6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5kplw" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.306600 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"audit-policies\" (UniqueName: \"kubernetes.io/configmap/13cf2230-0cbf-49b1-92f4-b49b8d72acec-audit-policies\") pod \"apiserver-7bbb656c7d-5xggk\" (UID: \"13cf2230-0cbf-49b1-92f4-b49b8d72acec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5xggk" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.306613 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7h2lr\" (UniqueName: \"kubernetes.io/projected/c03f49e6-2884-451b-af8c-9d47555d1413-kube-api-access-7h2lr\") pod \"service-ca-9c57cc56f-4tvxr\" (UID: \"c03f49e6-2884-451b-af8c-9d47555d1413\") " pod="openshift-service-ca/service-ca-9c57cc56f-4tvxr" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.306631 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/895674c7-2ae3-4c93-8fcf-a2e1b7f2dfcb-proxy-tls\") pod \"machine-config-controller-84d6567774-hgkns\" (UID: \"895674c7-2ae3-4c93-8fcf-a2e1b7f2dfcb\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hgkns" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.306653 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpcg2\" (UniqueName: \"kubernetes.io/projected/b7402585-182c-46a1-a8ac-257c97421c8d-kube-api-access-cpcg2\") pod \"apiserver-76f77b778f-xdwlv\" (UID: \"b7402585-182c-46a1-a8ac-257c97421c8d\") " pod="openshift-apiserver/apiserver-76f77b778f-xdwlv" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.306671 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/468e155a-66ab-4823-b737-7e1f9c78ac3f-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-wt67p\" (UID: \"468e155a-66ab-4823-b737-7e1f9c78ac3f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-wt67p" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.306689 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/897e7b14-cb3a-40e7-a127-587bc5e9502a-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-gkckz\" (UID: \"897e7b14-cb3a-40e7-a127-587bc5e9502a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gkckz" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.306704 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/895674c7-2ae3-4c93-8fcf-a2e1b7f2dfcb-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-hgkns\" (UID: \"895674c7-2ae3-4c93-8fcf-a2e1b7f2dfcb\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hgkns" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.306718 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68q49\" (UniqueName: \"kubernetes.io/projected/147065ae-e48c-4712-8d72-016b2a692da8-kube-api-access-68q49\") pod \"olm-operator-6b444d44fb-skgqs\" (UID: \"147065ae-e48c-4712-8d72-016b2a692da8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-skgqs" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.306733 4555 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d73470df-6b49-4a9f-bcd8-a0acac5cb921-auth-proxy-config\") pod \"machine-config-operator-74547568cd-r9tjc\" (UID: \"d73470df-6b49-4a9f-bcd8-a0acac5cb921\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-r9tjc" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.306754 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wd6tn\" (UID: \"eee41022-2a9c-41c8-bbc8-ff42874c4050\") " pod="openshift-image-registry/image-registry-697d97f7c8-wd6tn" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.306770 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5926dc72-d5e9-4c49-afba-ac1ce69678be-trusted-ca\") pod \"console-operator-58897d9998-7drj7\" (UID: \"5926dc72-d5e9-4c49-afba-ac1ce69678be\") " pod="openshift-console-operator/console-operator-58897d9998-7drj7" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.306786 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a7fc2be6-0cb9-4fad-ac58-37004264a563-service-ca\") pod \"console-f9d7485db-vhncm\" (UID: \"a7fc2be6-0cb9-4fad-ac58-37004264a563\") " pod="openshift-console/console-f9d7485db-vhncm" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.306802 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/89fdb2fe-9563-4934-82c8-10c218d81bea-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-v5j6b\" (UID: \"89fdb2fe-9563-4934-82c8-10c218d81bea\") " pod="openshift-authentication/oauth-openshift-558db77b4-v5j6b" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.307019 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/eee41022-2a9c-41c8-bbc8-ff42874c4050-registry-certificates\") pod \"image-registry-697d97f7c8-wd6tn\" (UID: \"eee41022-2a9c-41c8-bbc8-ff42874c4050\") " pod="openshift-image-registry/image-registry-697d97f7c8-wd6tn" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.307469 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/adaadc75-0b7d-4699-8d61-fe5ebb7dc84a-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-9xbb6\" (UID: \"adaadc75-0b7d-4699-8d61-fe5ebb7dc84a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9xbb6" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.307541 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/89fdb2fe-9563-4934-82c8-10c218d81bea-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-v5j6b\" (UID: \"89fdb2fe-9563-4934-82c8-10c218d81bea\") " pod="openshift-authentication/oauth-openshift-558db77b4-v5j6b" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.307676 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/2d15dddf-4a72-4590-9def-755a9fd11da6-service-ca-bundle\") pod \"authentication-operator-69f744f599-5kplw\" (UID: \"2d15dddf-4a72-4590-9def-755a9fd11da6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5kplw" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.307681 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/b7402585-182c-46a1-a8ac-257c97421c8d-image-import-ca\") pod \"apiserver-76f77b778f-xdwlv\" (UID: \"b7402585-182c-46a1-a8ac-257c97421c8d\") " pod="openshift-apiserver/apiserver-76f77b778f-xdwlv" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.307838 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5926dc72-d5e9-4c49-afba-ac1ce69678be-trusted-ca\") pod \"console-operator-58897d9998-7drj7\" (UID: \"5926dc72-d5e9-4c49-afba-ac1ce69678be\") " pod="openshift-console-operator/console-operator-58897d9998-7drj7" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.308339 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e376f745-76d4-42dd-97f7-4554c02eec74-config\") pod \"kube-controller-manager-operator-78b949d7b-pw6lj\" (UID: \"e376f745-76d4-42dd-97f7-4554c02eec74\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-pw6lj" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.309110 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7402585-182c-46a1-a8ac-257c97421c8d-config\") pod \"apiserver-76f77b778f-xdwlv\" (UID: \"b7402585-182c-46a1-a8ac-257c97421c8d\") " pod="openshift-apiserver/apiserver-76f77b778f-xdwlv" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.309873 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e40fc0c2-28f1-444d-8ff4-8f11faa1db2b-serving-cert\") pod \"etcd-operator-b45778765-nqlbc\" (UID: \"e40fc0c2-28f1-444d-8ff4-8f11faa1db2b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nqlbc" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.309903 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/e40fc0c2-28f1-444d-8ff4-8f11faa1db2b-etcd-service-ca\") pod \"etcd-operator-b45778765-nqlbc\" (UID: \"e40fc0c2-28f1-444d-8ff4-8f11faa1db2b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nqlbc" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.309922 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b7402585-182c-46a1-a8ac-257c97421c8d-etcd-client\") pod \"apiserver-76f77b778f-xdwlv\" (UID: \"b7402585-182c-46a1-a8ac-257c97421c8d\") " pod="openshift-apiserver/apiserver-76f77b778f-xdwlv" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.309939 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e376f745-76d4-42dd-97f7-4554c02eec74-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-pw6lj\" (UID: \"e376f745-76d4-42dd-97f7-4554c02eec74\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-pw6lj" Dec 09 15:00:25 crc 
kubenswrapper[4555]: I1209 15:00:25.309964 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-df2gb\" (UniqueName: \"kubernetes.io/projected/33ad6890-1e95-4a46-9f40-00cf0443d4a1-kube-api-access-df2gb\") pod \"router-default-5444994796-pldb7\" (UID: \"33ad6890-1e95-4a46-9f40-00cf0443d4a1\") " pod="openshift-ingress/router-default-5444994796-pldb7" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.309982 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4dds\" (UniqueName: \"kubernetes.io/projected/8c1c0513-5d12-463c-b66d-c1a5d02eafc0-kube-api-access-z4dds\") pod \"marketplace-operator-79b997595-srlsw\" (UID: \"8c1c0513-5d12-463c-b66d-c1a5d02eafc0\") " pod="openshift-marketplace/marketplace-operator-79b997595-srlsw" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.310001 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/eee41022-2a9c-41c8-bbc8-ff42874c4050-bound-sa-token\") pod \"image-registry-697d97f7c8-wd6tn\" (UID: \"eee41022-2a9c-41c8-bbc8-ff42874c4050\") " pod="openshift-image-registry/image-registry-697d97f7c8-wd6tn" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.310018 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a7fc2be6-0cb9-4fad-ac58-37004264a563-oauth-serving-cert\") pod \"console-f9d7485db-vhncm\" (UID: \"a7fc2be6-0cb9-4fad-ac58-37004264a563\") " pod="openshift-console/console-f9d7485db-vhncm" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.310033 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxzb9\" (UniqueName: \"kubernetes.io/projected/2267e449-66c1-4699-897b-9851e54d2645-kube-api-access-jxzb9\") pod \"dns-default-9l7sb\" (UID: \"2267e449-66c1-4699-897b-9851e54d2645\") " pod="openshift-dns/dns-default-9l7sb" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.310059 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/13cf2230-0cbf-49b1-92f4-b49b8d72acec-encryption-config\") pod \"apiserver-7bbb656c7d-5xggk\" (UID: \"13cf2230-0cbf-49b1-92f4-b49b8d72acec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5xggk" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.310134 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/0445dab5-5a42-4098-9fe0-e38b5650ecd6-csi-data-dir\") pod \"csi-hostpathplugin-2wsmx\" (UID: \"0445dab5-5a42-4098-9fe0-e38b5650ecd6\") " pod="hostpath-provisioner/csi-hostpathplugin-2wsmx" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.310155 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d73470df-6b49-4a9f-bcd8-a0acac5cb921-images\") pod \"machine-config-operator-74547568cd-r9tjc\" (UID: \"d73470df-6b49-4a9f-bcd8-a0acac5cb921\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-r9tjc" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.310171 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbxs9\" (UniqueName: 
\"kubernetes.io/projected/895674c7-2ae3-4c93-8fcf-a2e1b7f2dfcb-kube-api-access-mbxs9\") pod \"machine-config-controller-84d6567774-hgkns\" (UID: \"895674c7-2ae3-4c93-8fcf-a2e1b7f2dfcb\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hgkns" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.310206 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a77db5eb-461b-49d7-a2d7-91be5bb52a09-profile-collector-cert\") pod \"catalog-operator-68c6474976-gt6wf\" (UID: \"a77db5eb-461b-49d7-a2d7-91be5bb52a09\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gt6wf" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.310223 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwzpz\" (UniqueName: \"kubernetes.io/projected/5446f927-f240-4554-832a-28b950424eea-kube-api-access-qwzpz\") pod \"dns-operator-744455d44c-srtkn\" (UID: \"5446f927-f240-4554-832a-28b950424eea\") " pod="openshift-dns-operator/dns-operator-744455d44c-srtkn" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.310238 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnspm\" (UniqueName: \"kubernetes.io/projected/5926dc72-d5e9-4c49-afba-ac1ce69678be-kube-api-access-dnspm\") pod \"console-operator-58897d9998-7drj7\" (UID: \"5926dc72-d5e9-4c49-afba-ac1ce69678be\") " pod="openshift-console-operator/console-operator-58897d9998-7drj7" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.310242 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a7fc2be6-0cb9-4fad-ac58-37004264a563-console-config\") pod \"console-f9d7485db-vhncm\" (UID: \"a7fc2be6-0cb9-4fad-ac58-37004264a563\") " pod="openshift-console/console-f9d7485db-vhncm" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.310253 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/77dac49a-34a8-421c-99ab-1fa4a08fc6fa-available-featuregates\") pod \"openshift-config-operator-7777fb866f-mtdft\" (UID: \"77dac49a-34a8-421c-99ab-1fa4a08fc6fa\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mtdft" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.310331 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dn2kc\" (UniqueName: \"kubernetes.io/projected/eee41022-2a9c-41c8-bbc8-ff42874c4050-kube-api-access-dn2kc\") pod \"image-registry-697d97f7c8-wd6tn\" (UID: \"eee41022-2a9c-41c8-bbc8-ff42874c4050\") " pod="openshift-image-registry/image-registry-697d97f7c8-wd6tn" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.310354 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7ce81b97-c663-47e5-b75e-ba3333d5cb9a-auth-proxy-config\") pod \"machine-approver-56656f9798-8fb58\" (UID: \"7ce81b97-c663-47e5-b75e-ba3333d5cb9a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8fb58" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.310375 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/d98a1c63-4a49-437d-97a3-c9c37efce476-tmpfs\") pod 
\"packageserver-d55dfcdfc-7trl6\" (UID: \"d98a1c63-4a49-437d-97a3-c9c37efce476\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7trl6" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.310417 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrkvg\" (UniqueName: \"kubernetes.io/projected/d73470df-6b49-4a9f-bcd8-a0acac5cb921-kube-api-access-vrkvg\") pod \"machine-config-operator-74547568cd-r9tjc\" (UID: \"d73470df-6b49-4a9f-bcd8-a0acac5cb921\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-r9tjc" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.310436 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b7402585-182c-46a1-a8ac-257c97421c8d-audit-dir\") pod \"apiserver-76f77b778f-xdwlv\" (UID: \"b7402585-182c-46a1-a8ac-257c97421c8d\") " pod="openshift-apiserver/apiserver-76f77b778f-xdwlv" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.310476 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvf7f\" (UniqueName: \"kubernetes.io/projected/dd026b3b-b5df-4f5a-81c8-2bf0b6b8c766-kube-api-access-zvf7f\") pod \"migrator-59844c95c7-89bvk\" (UID: \"dd026b3b-b5df-4f5a-81c8-2bf0b6b8c766\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-89bvk" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.310494 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8c1c0513-5d12-463c-b66d-c1a5d02eafc0-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-srlsw\" (UID: \"8c1c0513-5d12-463c-b66d-c1a5d02eafc0\") " pod="openshift-marketplace/marketplace-operator-79b997595-srlsw" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.310498 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5a1d5a2-358f-4445-9253-5a9bc8b9daf3-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-wzp94\" (UID: \"b5a1d5a2-358f-4445-9253-5a9bc8b9daf3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wzp94" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.310511 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e40fc0c2-28f1-444d-8ff4-8f11faa1db2b-config\") pod \"etcd-operator-b45778765-nqlbc\" (UID: \"e40fc0c2-28f1-444d-8ff4-8f11faa1db2b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nqlbc" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.310528 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/c03f49e6-2884-451b-af8c-9d47555d1413-signing-cabundle\") pod \"service-ca-9c57cc56f-4tvxr\" (UID: \"c03f49e6-2884-451b-af8c-9d47555d1413\") " pod="openshift-service-ca/service-ca-9c57cc56f-4tvxr" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.310555 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b7402585-182c-46a1-a8ac-257c97421c8d-trusted-ca-bundle\") pod \"apiserver-76f77b778f-xdwlv\" (UID: \"b7402585-182c-46a1-a8ac-257c97421c8d\") " 
pod="openshift-apiserver/apiserver-76f77b778f-xdwlv" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.310563 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0445dab5-5a42-4098-9fe0-e38b5650ecd6-socket-dir\") pod \"csi-hostpathplugin-2wsmx\" (UID: \"0445dab5-5a42-4098-9fe0-e38b5650ecd6\") " pod="hostpath-provisioner/csi-hostpathplugin-2wsmx" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.310596 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/89fdb2fe-9563-4934-82c8-10c218d81bea-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-v5j6b\" (UID: \"89fdb2fe-9563-4934-82c8-10c218d81bea\") " pod="openshift-authentication/oauth-openshift-558db77b4-v5j6b" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.310604 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/77dac49a-34a8-421c-99ab-1fa4a08fc6fa-available-featuregates\") pod \"openshift-config-operator-7777fb866f-mtdft\" (UID: \"77dac49a-34a8-421c-99ab-1fa4a08fc6fa\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mtdft" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.310612 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2267e449-66c1-4699-897b-9851e54d2645-metrics-tls\") pod \"dns-default-9l7sb\" (UID: \"2267e449-66c1-4699-897b-9851e54d2645\") " pod="openshift-dns/dns-default-9l7sb" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.310651 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8jwt\" (UniqueName: \"kubernetes.io/projected/77dac49a-34a8-421c-99ab-1fa4a08fc6fa-kube-api-access-p8jwt\") pod \"openshift-config-operator-7777fb866f-mtdft\" (UID: \"77dac49a-34a8-421c-99ab-1fa4a08fc6fa\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mtdft" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.325161 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/e40fc0c2-28f1-444d-8ff4-8f11faa1db2b-etcd-ca\") pod \"etcd-operator-b45778765-nqlbc\" (UID: \"e40fc0c2-28f1-444d-8ff4-8f11faa1db2b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nqlbc" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.326519 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/13cf2230-0cbf-49b1-92f4-b49b8d72acec-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-5xggk\" (UID: \"13cf2230-0cbf-49b1-92f4-b49b8d72acec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5xggk" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.327050 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/eee41022-2a9c-41c8-bbc8-ff42874c4050-registry-tls\") pod \"image-registry-697d97f7c8-wd6tn\" (UID: \"eee41022-2a9c-41c8-bbc8-ff42874c4050\") " pod="openshift-image-registry/image-registry-697d97f7c8-wd6tn" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.327105 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/13cf2230-0cbf-49b1-92f4-b49b8d72acec-audit-dir\") pod \"apiserver-7bbb656c7d-5xggk\" (UID: \"13cf2230-0cbf-49b1-92f4-b49b8d72acec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5xggk" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.327222 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7ce81b97-c663-47e5-b75e-ba3333d5cb9a-auth-proxy-config\") pod \"machine-approver-56656f9798-8fb58\" (UID: \"7ce81b97-c663-47e5-b75e-ba3333d5cb9a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8fb58" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.327279 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b7402585-182c-46a1-a8ac-257c97421c8d-audit-dir\") pod \"apiserver-76f77b778f-xdwlv\" (UID: \"b7402585-182c-46a1-a8ac-257c97421c8d\") " pod="openshift-apiserver/apiserver-76f77b778f-xdwlv" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.327427 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/89fdb2fe-9563-4934-82c8-10c218d81bea-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-v5j6b\" (UID: \"89fdb2fe-9563-4934-82c8-10c218d81bea\") " pod="openshift-authentication/oauth-openshift-558db77b4-v5j6b" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.327542 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b7402585-182c-46a1-a8ac-257c97421c8d-encryption-config\") pod \"apiserver-76f77b778f-xdwlv\" (UID: \"b7402585-182c-46a1-a8ac-257c97421c8d\") " pod="openshift-apiserver/apiserver-76f77b778f-xdwlv" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.327548 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5a1d5a2-358f-4445-9253-5a9bc8b9daf3-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-wzp94\" (UID: \"b5a1d5a2-358f-4445-9253-5a9bc8b9daf3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wzp94" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.327661 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b87660e6-af7f-42a9-bdb5-7bee97dab52f-config\") pod \"openshift-apiserver-operator-796bbdcf4f-mk586\" (UID: \"b87660e6-af7f-42a9-bdb5-7bee97dab52f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mk586" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.327765 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e40fc0c2-28f1-444d-8ff4-8f11faa1db2b-config\") pod \"etcd-operator-b45778765-nqlbc\" (UID: \"e40fc0c2-28f1-444d-8ff4-8f11faa1db2b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nqlbc" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.327892 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/89fdb2fe-9563-4934-82c8-10c218d81bea-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-v5j6b\" (UID: \"89fdb2fe-9563-4934-82c8-10c218d81bea\") " pod="openshift-authentication/oauth-openshift-558db77b4-v5j6b" Dec 09 15:00:25 crc 
kubenswrapper[4555]: I1209 15:00:25.327899 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b7402585-182c-46a1-a8ac-257c97421c8d-serving-cert\") pod \"apiserver-76f77b778f-xdwlv\" (UID: \"b7402585-182c-46a1-a8ac-257c97421c8d\") " pod="openshift-apiserver/apiserver-76f77b778f-xdwlv" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.328099 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/eee41022-2a9c-41c8-bbc8-ff42874c4050-installation-pull-secrets\") pod \"image-registry-697d97f7c8-wd6tn\" (UID: \"eee41022-2a9c-41c8-bbc8-ff42874c4050\") " pod="openshift-image-registry/image-registry-697d97f7c8-wd6tn" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.328354 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/89fdb2fe-9563-4934-82c8-10c218d81bea-audit-policies\") pod \"oauth-openshift-558db77b4-v5j6b\" (UID: \"89fdb2fe-9563-4934-82c8-10c218d81bea\") " pod="openshift-authentication/oauth-openshift-558db77b4-v5j6b" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.328373 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5926dc72-d5e9-4c49-afba-ac1ce69678be-serving-cert\") pod \"console-operator-58897d9998-7drj7\" (UID: \"5926dc72-d5e9-4c49-afba-ac1ce69678be\") " pod="openshift-console-operator/console-operator-58897d9998-7drj7" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.328560 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f6a1efb-6c81-45a4-91e2-9f67f8a266be-serving-cert\") pod \"service-ca-operator-777779d784-b27rg\" (UID: \"6f6a1efb-6c81-45a4-91e2-9f67f8a266be\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-b27rg" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.328610 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/0a4f7efd-8434-48a7-979e-9f128e93797b-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-jxl9b\" (UID: \"0a4f7efd-8434-48a7-979e-9f128e93797b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jxl9b" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.328709 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/89fdb2fe-9563-4934-82c8-10c218d81bea-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-v5j6b\" (UID: \"89fdb2fe-9563-4934-82c8-10c218d81bea\") " pod="openshift-authentication/oauth-openshift-558db77b4-v5j6b" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.328852 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/89fdb2fe-9563-4934-82c8-10c218d81bea-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-v5j6b\" (UID: \"89fdb2fe-9563-4934-82c8-10c218d81bea\") " pod="openshift-authentication/oauth-openshift-558db77b4-v5j6b" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.328887 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d3f2ca3d-d5ad-4a22-96d4-3bc4c5eb7a96-metrics-tls\") pod 
\"ingress-operator-5b745b69d9-2f6gc\" (UID: \"d3f2ca3d-d5ad-4a22-96d4-3bc4c5eb7a96\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2f6gc" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.328973 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/897e7b14-cb3a-40e7-a127-587bc5e9502a-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-gkckz\" (UID: \"897e7b14-cb3a-40e7-a127-587bc5e9502a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gkckz" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.328996 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eee41022-2a9c-41c8-bbc8-ff42874c4050-trusted-ca\") pod \"image-registry-697d97f7c8-wd6tn\" (UID: \"eee41022-2a9c-41c8-bbc8-ff42874c4050\") " pod="openshift-image-registry/image-registry-697d97f7c8-wd6tn" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.329021 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbl9s\" (UniqueName: \"kubernetes.io/projected/b5a1d5a2-358f-4445-9253-5a9bc8b9daf3-kube-api-access-xbl9s\") pod \"openshift-controller-manager-operator-756b6f6bc6-wzp94\" (UID: \"b5a1d5a2-358f-4445-9253-5a9bc8b9daf3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wzp94" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.329047 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/bc98cfae-5f65-416a-ba4b-24f601ff6209-node-bootstrap-token\") pod \"machine-config-server-whvjt\" (UID: \"bc98cfae-5f65-416a-ba4b-24f601ff6209\") " pod="openshift-machine-config-operator/machine-config-server-whvjt" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.329089 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbhqh\" (UniqueName: \"kubernetes.io/projected/3936549f-6055-4b0d-af18-7958d91c672f-kube-api-access-wbhqh\") pod \"collect-profiles-29421540-hhxsf\" (UID: \"3936549f-6055-4b0d-af18-7958d91c672f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421540-hhxsf" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.329095 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/89fdb2fe-9563-4934-82c8-10c218d81bea-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-v5j6b\" (UID: \"89fdb2fe-9563-4934-82c8-10c218d81bea\") " pod="openshift-authentication/oauth-openshift-558db77b4-v5j6b" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.329114 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d15dddf-4a72-4590-9def-755a9fd11da6-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-5kplw\" (UID: \"2d15dddf-4a72-4590-9def-755a9fd11da6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5kplw" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.329172 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/13cf2230-0cbf-49b1-92f4-b49b8d72acec-etcd-client\") pod \"apiserver-7bbb656c7d-5xggk\" (UID: 
\"13cf2230-0cbf-49b1-92f4-b49b8d72acec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5xggk" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.330527 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ce81b97-c663-47e5-b75e-ba3333d5cb9a-config\") pod \"machine-approver-56656f9798-8fb58\" (UID: \"7ce81b97-c663-47e5-b75e-ba3333d5cb9a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8fb58" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.330855 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/13cf2230-0cbf-49b1-92f4-b49b8d72acec-serving-cert\") pod \"apiserver-7bbb656c7d-5xggk\" (UID: \"13cf2230-0cbf-49b1-92f4-b49b8d72acec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5xggk" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.331002 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eee41022-2a9c-41c8-bbc8-ff42874c4050-trusted-ca\") pod \"image-registry-697d97f7c8-wd6tn\" (UID: \"eee41022-2a9c-41c8-bbc8-ff42874c4050\") " pod="openshift-image-registry/image-registry-697d97f7c8-wd6tn" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.331318 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/eee41022-2a9c-41c8-bbc8-ff42874c4050-ca-trust-extracted\") pod \"image-registry-697d97f7c8-wd6tn\" (UID: \"eee41022-2a9c-41c8-bbc8-ff42874c4050\") " pod="openshift-image-registry/image-registry-697d97f7c8-wd6tn" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.331688 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d15dddf-4a72-4590-9def-755a9fd11da6-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-5kplw\" (UID: \"2d15dddf-4a72-4590-9def-755a9fd11da6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5kplw" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.331713 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a7fc2be6-0cb9-4fad-ac58-37004264a563-console-oauth-config\") pod \"console-f9d7485db-vhncm\" (UID: \"a7fc2be6-0cb9-4fad-ac58-37004264a563\") " pod="openshift-console/console-f9d7485db-vhncm" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.331747 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d15dddf-4a72-4590-9def-755a9fd11da6-config\") pod \"authentication-operator-69f744f599-5kplw\" (UID: \"2d15dddf-4a72-4590-9def-755a9fd11da6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5kplw" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.331774 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a7fc2be6-0cb9-4fad-ac58-37004264a563-console-serving-cert\") pod \"console-f9d7485db-vhncm\" (UID: \"a7fc2be6-0cb9-4fad-ac58-37004264a563\") " pod="openshift-console/console-f9d7485db-vhncm" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.331804 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/3936549f-6055-4b0d-af18-7958d91c672f-config-volume\") pod \"collect-profiles-29421540-hhxsf\" (UID: \"3936549f-6055-4b0d-af18-7958d91c672f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421540-hhxsf" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.331834 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-llvq5"] Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.332011 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/13cf2230-0cbf-49b1-92f4-b49b8d72acec-encryption-config\") pod \"apiserver-7bbb656c7d-5xggk\" (UID: \"13cf2230-0cbf-49b1-92f4-b49b8d72acec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5xggk" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.332212 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77dac49a-34a8-421c-99ab-1fa4a08fc6fa-serving-cert\") pod \"openshift-config-operator-7777fb866f-mtdft\" (UID: \"77dac49a-34a8-421c-99ab-1fa4a08fc6fa\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mtdft" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.332711 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d15dddf-4a72-4590-9def-755a9fd11da6-config\") pod \"authentication-operator-69f744f599-5kplw\" (UID: \"2d15dddf-4a72-4590-9def-755a9fd11da6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5kplw" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.333330 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/89fdb2fe-9563-4934-82c8-10c218d81bea-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-v5j6b\" (UID: \"89fdb2fe-9563-4934-82c8-10c218d81bea\") " pod="openshift-authentication/oauth-openshift-558db77b4-v5j6b" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.333706 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a7fc2be6-0cb9-4fad-ac58-37004264a563-oauth-serving-cert\") pod \"console-f9d7485db-vhncm\" (UID: \"a7fc2be6-0cb9-4fad-ac58-37004264a563\") " pod="openshift-console/console-f9d7485db-vhncm" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.333907 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b87660e6-af7f-42a9-bdb5-7bee97dab52f-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-mk586\" (UID: \"b87660e6-af7f-42a9-bdb5-7bee97dab52f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mk586" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.334772 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e40fc0c2-28f1-444d-8ff4-8f11faa1db2b-serving-cert\") pod \"etcd-operator-b45778765-nqlbc\" (UID: \"e40fc0c2-28f1-444d-8ff4-8f11faa1db2b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nqlbc" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.334867 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/e40fc0c2-28f1-444d-8ff4-8f11faa1db2b-etcd-service-ca\") pod 
\"etcd-operator-b45778765-nqlbc\" (UID: \"e40fc0c2-28f1-444d-8ff4-8f11faa1db2b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nqlbc" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.335918 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/7ce81b97-c663-47e5-b75e-ba3333d5cb9a-machine-approver-tls\") pod \"machine-approver-56656f9798-8fb58\" (UID: \"7ce81b97-c663-47e5-b75e-ba3333d5cb9a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8fb58" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.336194 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a7fc2be6-0cb9-4fad-ac58-37004264a563-console-serving-cert\") pod \"console-f9d7485db-vhncm\" (UID: \"a7fc2be6-0cb9-4fad-ac58-37004264a563\") " pod="openshift-console/console-f9d7485db-vhncm" Dec 09 15:00:25 crc kubenswrapper[4555]: E1209 15:00:25.336589 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 15:00:25.836545594 +0000 UTC m=+135.973258312 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wd6tn" (UID: "eee41022-2a9c-41c8-bbc8-ff42874c4050") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.337095 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b7402585-182c-46a1-a8ac-257c97421c8d-etcd-client\") pod \"apiserver-76f77b778f-xdwlv\" (UID: \"b7402585-182c-46a1-a8ac-257c97421c8d\") " pod="openshift-apiserver/apiserver-76f77b778f-xdwlv" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.337275 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/b7402585-182c-46a1-a8ac-257c97421c8d-audit\") pod \"apiserver-76f77b778f-xdwlv\" (UID: \"b7402585-182c-46a1-a8ac-257c97421c8d\") " pod="openshift-apiserver/apiserver-76f77b778f-xdwlv" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.337804 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/adaadc75-0b7d-4699-8d61-fe5ebb7dc84a-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-9xbb6\" (UID: \"adaadc75-0b7d-4699-8d61-fe5ebb7dc84a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9xbb6" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.338064 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d3f2ca3d-d5ad-4a22-96d4-3bc4c5eb7a96-trusted-ca\") pod \"ingress-operator-5b745b69d9-2f6gc\" (UID: \"d3f2ca3d-d5ad-4a22-96d4-3bc4c5eb7a96\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2f6gc" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.338252 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/a7fc2be6-0cb9-4fad-ac58-37004264a563-trusted-ca-bundle\") pod \"console-f9d7485db-vhncm\" (UID: \"a7fc2be6-0cb9-4fad-ac58-37004264a563\") " pod="openshift-console/console-f9d7485db-vhncm" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.338519 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b7402585-182c-46a1-a8ac-257c97421c8d-etcd-serving-ca\") pod \"apiserver-76f77b778f-xdwlv\" (UID: \"b7402585-182c-46a1-a8ac-257c97421c8d\") " pod="openshift-apiserver/apiserver-76f77b778f-xdwlv" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.338816 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a7fc2be6-0cb9-4fad-ac58-37004264a563-service-ca\") pod \"console-f9d7485db-vhncm\" (UID: \"a7fc2be6-0cb9-4fad-ac58-37004264a563\") " pod="openshift-console/console-f9d7485db-vhncm" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.338970 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/89fdb2fe-9563-4934-82c8-10c218d81bea-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-v5j6b\" (UID: \"89fdb2fe-9563-4934-82c8-10c218d81bea\") " pod="openshift-authentication/oauth-openshift-558db77b4-v5j6b" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.339096 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/89fdb2fe-9563-4934-82c8-10c218d81bea-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-v5j6b\" (UID: \"89fdb2fe-9563-4934-82c8-10c218d81bea\") " pod="openshift-authentication/oauth-openshift-558db77b4-v5j6b" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.339807 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5446f927-f240-4554-832a-28b950424eea-metrics-tls\") pod \"dns-operator-744455d44c-srtkn\" (UID: \"5446f927-f240-4554-832a-28b950424eea\") " pod="openshift-dns-operator/dns-operator-744455d44c-srtkn" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.340358 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/89fdb2fe-9563-4934-82c8-10c218d81bea-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-v5j6b\" (UID: \"89fdb2fe-9563-4934-82c8-10c218d81bea\") " pod="openshift-authentication/oauth-openshift-558db77b4-v5j6b" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.341766 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d15dddf-4a72-4590-9def-755a9fd11da6-serving-cert\") pod \"authentication-operator-69f744f599-5kplw\" (UID: \"2d15dddf-4a72-4590-9def-755a9fd11da6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5kplw" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.343762 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e40fc0c2-28f1-444d-8ff4-8f11faa1db2b-etcd-client\") pod \"etcd-operator-b45778765-nqlbc\" (UID: \"e40fc0c2-28f1-444d-8ff4-8f11faa1db2b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nqlbc" Dec 09 15:00:25 crc kubenswrapper[4555]: 
I1209 15:00:25.344937 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e376f745-76d4-42dd-97f7-4554c02eec74-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-pw6lj\" (UID: \"e376f745-76d4-42dd-97f7-4554c02eec74\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-pw6lj" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.354840 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ns6z2\" (UniqueName: \"kubernetes.io/projected/a7fc2be6-0cb9-4fad-ac58-37004264a563-kube-api-access-ns6z2\") pod \"console-f9d7485db-vhncm\" (UID: \"a7fc2be6-0cb9-4fad-ac58-37004264a563\") " pod="openshift-console/console-f9d7485db-vhncm" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.365214 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dlj7\" (UniqueName: \"kubernetes.io/projected/e40fc0c2-28f1-444d-8ff4-8f11faa1db2b-kube-api-access-9dlj7\") pod \"etcd-operator-b45778765-nqlbc\" (UID: \"e40fc0c2-28f1-444d-8ff4-8f11faa1db2b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nqlbc" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.380751 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-5ncnf" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.386117 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqd8z\" (UniqueName: \"kubernetes.io/projected/b87660e6-af7f-42a9-bdb5-7bee97dab52f-kube-api-access-fqd8z\") pod \"openshift-apiserver-operator-796bbdcf4f-mk586\" (UID: \"b87660e6-af7f-42a9-bdb5-7bee97dab52f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mk586" Dec 09 15:00:25 crc kubenswrapper[4555]: W1209 15:00:25.398834 4555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61f20e90_b252_48ed_978a_8aec4b3d98ba.slice/crio-72af49ec40c6496f55ef9288eccb1aef70679e26c7a012c89574cc4c8df5fcff WatchSource:0}: Error finding container 72af49ec40c6496f55ef9288eccb1aef70679e26c7a012c89574cc4c8df5fcff: Status 404 returned error can't find the container with id 72af49ec40c6496f55ef9288eccb1aef70679e26c7a012c89574cc4c8df5fcff Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.403068 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7h6sb"] Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.405952 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrqkx\" (UniqueName: \"kubernetes.io/projected/89fdb2fe-9563-4934-82c8-10c218d81bea-kube-api-access-xrqkx\") pod \"oauth-openshift-558db77b4-v5j6b\" (UID: \"89fdb2fe-9563-4934-82c8-10c218d81bea\") " pod="openshift-authentication/oauth-openshift-558db77b4-v5j6b" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.424727 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d3f2ca3d-d5ad-4a22-96d4-3bc4c5eb7a96-bound-sa-token\") pod \"ingress-operator-5b745b69d9-2f6gc\" (UID: \"d3f2ca3d-d5ad-4a22-96d4-3bc4c5eb7a96\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2f6gc" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.432578 4555 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.432851 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/0445dab5-5a42-4098-9fe0-e38b5650ecd6-mountpoint-dir\") pod \"csi-hostpathplugin-2wsmx\" (UID: \"0445dab5-5a42-4098-9fe0-e38b5650ecd6\") " pod="hostpath-provisioner/csi-hostpathplugin-2wsmx" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.432873 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/147065ae-e48c-4712-8d72-016b2a692da8-srv-cert\") pod \"olm-operator-6b444d44fb-skgqs\" (UID: \"147065ae-e48c-4712-8d72-016b2a692da8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-skgqs" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.432905 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8c1c0513-5d12-463c-b66d-c1a5d02eafc0-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-srlsw\" (UID: \"8c1c0513-5d12-463c-b66d-c1a5d02eafc0\") " pod="openshift-marketplace/marketplace-operator-79b997595-srlsw" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.432921 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0445dab5-5a42-4098-9fe0-e38b5650ecd6-registration-dir\") pod \"csi-hostpathplugin-2wsmx\" (UID: \"0445dab5-5a42-4098-9fe0-e38b5650ecd6\") " pod="hostpath-provisioner/csi-hostpathplugin-2wsmx" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.432938 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fswsp\" (UniqueName: \"kubernetes.io/projected/c1b1ecba-4701-4e11-b2e9-1327cc2c5c2d-kube-api-access-fswsp\") pod \"ingress-canary-89w6m\" (UID: \"c1b1ecba-4701-4e11-b2e9-1327cc2c5c2d\") " pod="openshift-ingress-canary/ingress-canary-89w6m" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.432960 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zl7qg\" (UniqueName: \"kubernetes.io/projected/dc25bf0a-f5e4-4f6d-9442-6b55a043f751-kube-api-access-zl7qg\") pod \"control-plane-machine-set-operator-78cbb6b69f-rsbss\" (UID: \"dc25bf0a-f5e4-4f6d-9442-6b55a043f751\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rsbss" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.432979 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d98a1c63-4a49-437d-97a3-c9c37efce476-apiservice-cert\") pod \"packageserver-d55dfcdfc-7trl6\" (UID: \"d98a1c63-4a49-437d-97a3-c9c37efce476\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7trl6" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.432995 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/33ad6890-1e95-4a46-9f40-00cf0443d4a1-default-certificate\") pod \"router-default-5444994796-pldb7\" (UID: 
\"33ad6890-1e95-4a46-9f40-00cf0443d4a1\") " pod="openshift-ingress/router-default-5444994796-pldb7" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.433010 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/33ad6890-1e95-4a46-9f40-00cf0443d4a1-metrics-certs\") pod \"router-default-5444994796-pldb7\" (UID: \"33ad6890-1e95-4a46-9f40-00cf0443d4a1\") " pod="openshift-ingress/router-default-5444994796-pldb7" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.433024 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d98a1c63-4a49-437d-97a3-c9c37efce476-webhook-cert\") pod \"packageserver-d55dfcdfc-7trl6\" (UID: \"d98a1c63-4a49-437d-97a3-c9c37efce476\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7trl6" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.433037 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wj6jr\" (UniqueName: \"kubernetes.io/projected/bc98cfae-5f65-416a-ba4b-24f601ff6209-kube-api-access-wj6jr\") pod \"machine-config-server-whvjt\" (UID: \"bc98cfae-5f65-416a-ba4b-24f601ff6209\") " pod="openshift-machine-config-operator/machine-config-server-whvjt" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.433070 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/0445dab5-5a42-4098-9fe0-e38b5650ecd6-plugins-dir\") pod \"csi-hostpathplugin-2wsmx\" (UID: \"0445dab5-5a42-4098-9fe0-e38b5650ecd6\") " pod="hostpath-provisioner/csi-hostpathplugin-2wsmx" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.433085 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2267e449-66c1-4699-897b-9851e54d2645-config-volume\") pod \"dns-default-9l7sb\" (UID: \"2267e449-66c1-4699-897b-9851e54d2645\") " pod="openshift-dns/dns-default-9l7sb" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.433098 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gs7v4\" (UniqueName: \"kubernetes.io/projected/468e155a-66ab-4823-b737-7e1f9c78ac3f-kube-api-access-gs7v4\") pod \"package-server-manager-789f6589d5-wt67p\" (UID: \"468e155a-66ab-4823-b737-7e1f9c78ac3f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-wt67p" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.433119 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/c03f49e6-2884-451b-af8c-9d47555d1413-signing-key\") pod \"service-ca-9c57cc56f-4tvxr\" (UID: \"c03f49e6-2884-451b-af8c-9d47555d1413\") " pod="openshift-service-ca/service-ca-9c57cc56f-4tvxr" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.433134 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a77db5eb-461b-49d7-a2d7-91be5bb52a09-srv-cert\") pod \"catalog-operator-68c6474976-gt6wf\" (UID: \"a77db5eb-461b-49d7-a2d7-91be5bb52a09\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gt6wf" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.433147 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7h2lr\" (UniqueName: 
\"kubernetes.io/projected/c03f49e6-2884-451b-af8c-9d47555d1413-kube-api-access-7h2lr\") pod \"service-ca-9c57cc56f-4tvxr\" (UID: \"c03f49e6-2884-451b-af8c-9d47555d1413\") " pod="openshift-service-ca/service-ca-9c57cc56f-4tvxr" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.433160 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/895674c7-2ae3-4c93-8fcf-a2e1b7f2dfcb-proxy-tls\") pod \"machine-config-controller-84d6567774-hgkns\" (UID: \"895674c7-2ae3-4c93-8fcf-a2e1b7f2dfcb\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hgkns" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.433178 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/468e155a-66ab-4823-b737-7e1f9c78ac3f-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-wt67p\" (UID: \"468e155a-66ab-4823-b737-7e1f9c78ac3f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-wt67p" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.433199 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/895674c7-2ae3-4c93-8fcf-a2e1b7f2dfcb-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-hgkns\" (UID: \"895674c7-2ae3-4c93-8fcf-a2e1b7f2dfcb\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hgkns" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.433213 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68q49\" (UniqueName: \"kubernetes.io/projected/147065ae-e48c-4712-8d72-016b2a692da8-kube-api-access-68q49\") pod \"olm-operator-6b444d44fb-skgqs\" (UID: \"147065ae-e48c-4712-8d72-016b2a692da8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-skgqs" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.433226 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d73470df-6b49-4a9f-bcd8-a0acac5cb921-auth-proxy-config\") pod \"machine-config-operator-74547568cd-r9tjc\" (UID: \"d73470df-6b49-4a9f-bcd8-a0acac5cb921\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-r9tjc" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.433265 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-df2gb\" (UniqueName: \"kubernetes.io/projected/33ad6890-1e95-4a46-9f40-00cf0443d4a1-kube-api-access-df2gb\") pod \"router-default-5444994796-pldb7\" (UID: \"33ad6890-1e95-4a46-9f40-00cf0443d4a1\") " pod="openshift-ingress/router-default-5444994796-pldb7" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.433290 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4dds\" (UniqueName: \"kubernetes.io/projected/8c1c0513-5d12-463c-b66d-c1a5d02eafc0-kube-api-access-z4dds\") pod \"marketplace-operator-79b997595-srlsw\" (UID: \"8c1c0513-5d12-463c-b66d-c1a5d02eafc0\") " pod="openshift-marketplace/marketplace-operator-79b997595-srlsw" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.433323 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxzb9\" (UniqueName: 
\"kubernetes.io/projected/2267e449-66c1-4699-897b-9851e54d2645-kube-api-access-jxzb9\") pod \"dns-default-9l7sb\" (UID: \"2267e449-66c1-4699-897b-9851e54d2645\") " pod="openshift-dns/dns-default-9l7sb" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.433343 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/0445dab5-5a42-4098-9fe0-e38b5650ecd6-csi-data-dir\") pod \"csi-hostpathplugin-2wsmx\" (UID: \"0445dab5-5a42-4098-9fe0-e38b5650ecd6\") " pod="hostpath-provisioner/csi-hostpathplugin-2wsmx" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.433357 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d73470df-6b49-4a9f-bcd8-a0acac5cb921-images\") pod \"machine-config-operator-74547568cd-r9tjc\" (UID: \"d73470df-6b49-4a9f-bcd8-a0acac5cb921\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-r9tjc" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.433372 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbxs9\" (UniqueName: \"kubernetes.io/projected/895674c7-2ae3-4c93-8fcf-a2e1b7f2dfcb-kube-api-access-mbxs9\") pod \"machine-config-controller-84d6567774-hgkns\" (UID: \"895674c7-2ae3-4c93-8fcf-a2e1b7f2dfcb\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hgkns" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.433385 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a77db5eb-461b-49d7-a2d7-91be5bb52a09-profile-collector-cert\") pod \"catalog-operator-68c6474976-gt6wf\" (UID: \"a77db5eb-461b-49d7-a2d7-91be5bb52a09\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gt6wf" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.433410 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/d98a1c63-4a49-437d-97a3-c9c37efce476-tmpfs\") pod \"packageserver-d55dfcdfc-7trl6\" (UID: \"d98a1c63-4a49-437d-97a3-c9c37efce476\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7trl6" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.433427 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrkvg\" (UniqueName: \"kubernetes.io/projected/d73470df-6b49-4a9f-bcd8-a0acac5cb921-kube-api-access-vrkvg\") pod \"machine-config-operator-74547568cd-r9tjc\" (UID: \"d73470df-6b49-4a9f-bcd8-a0acac5cb921\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-r9tjc" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.433467 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvf7f\" (UniqueName: \"kubernetes.io/projected/dd026b3b-b5df-4f5a-81c8-2bf0b6b8c766-kube-api-access-zvf7f\") pod \"migrator-59844c95c7-89bvk\" (UID: \"dd026b3b-b5df-4f5a-81c8-2bf0b6b8c766\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-89bvk" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.433483 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8c1c0513-5d12-463c-b66d-c1a5d02eafc0-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-srlsw\" (UID: \"8c1c0513-5d12-463c-b66d-c1a5d02eafc0\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-srlsw" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.433497 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/c03f49e6-2884-451b-af8c-9d47555d1413-signing-cabundle\") pod \"service-ca-9c57cc56f-4tvxr\" (UID: \"c03f49e6-2884-451b-af8c-9d47555d1413\") " pod="openshift-service-ca/service-ca-9c57cc56f-4tvxr" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.433509 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0445dab5-5a42-4098-9fe0-e38b5650ecd6-socket-dir\") pod \"csi-hostpathplugin-2wsmx\" (UID: \"0445dab5-5a42-4098-9fe0-e38b5650ecd6\") " pod="hostpath-provisioner/csi-hostpathplugin-2wsmx" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.433522 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2267e449-66c1-4699-897b-9851e54d2645-metrics-tls\") pod \"dns-default-9l7sb\" (UID: \"2267e449-66c1-4699-897b-9851e54d2645\") " pod="openshift-dns/dns-default-9l7sb" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.433546 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/bc98cfae-5f65-416a-ba4b-24f601ff6209-node-bootstrap-token\") pod \"machine-config-server-whvjt\" (UID: \"bc98cfae-5f65-416a-ba4b-24f601ff6209\") " pod="openshift-machine-config-operator/machine-config-server-whvjt" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.433564 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbhqh\" (UniqueName: \"kubernetes.io/projected/3936549f-6055-4b0d-af18-7958d91c672f-kube-api-access-wbhqh\") pod \"collect-profiles-29421540-hhxsf\" (UID: \"3936549f-6055-4b0d-af18-7958d91c672f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421540-hhxsf" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.433580 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3936549f-6055-4b0d-af18-7958d91c672f-config-volume\") pod \"collect-profiles-29421540-hhxsf\" (UID: \"3936549f-6055-4b0d-af18-7958d91c672f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421540-hhxsf" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.433595 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d73470df-6b49-4a9f-bcd8-a0acac5cb921-proxy-tls\") pod \"machine-config-operator-74547568cd-r9tjc\" (UID: \"d73470df-6b49-4a9f-bcd8-a0acac5cb921\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-r9tjc" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.433614 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1908bc6b-523d-4ffd-8e4c-e73f3e3b7ee3-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-msfrj\" (UID: \"1908bc6b-523d-4ffd-8e4c-e73f3e3b7ee3\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-msfrj" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.433626 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/33ad6890-1e95-4a46-9f40-00cf0443d4a1-service-ca-bundle\") pod \"router-default-5444994796-pldb7\" (UID: \"33ad6890-1e95-4a46-9f40-00cf0443d4a1\") " pod="openshift-ingress/router-default-5444994796-pldb7" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.433650 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c1b1ecba-4701-4e11-b2e9-1327cc2c5c2d-cert\") pod \"ingress-canary-89w6m\" (UID: \"c1b1ecba-4701-4e11-b2e9-1327cc2c5c2d\") " pod="openshift-ingress-canary/ingress-canary-89w6m" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.433664 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzpv7\" (UniqueName: \"kubernetes.io/projected/1908bc6b-523d-4ffd-8e4c-e73f3e3b7ee3-kube-api-access-rzpv7\") pod \"multus-admission-controller-857f4d67dd-msfrj\" (UID: \"1908bc6b-523d-4ffd-8e4c-e73f3e3b7ee3\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-msfrj" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.433689 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqnnj\" (UniqueName: \"kubernetes.io/projected/d98a1c63-4a49-437d-97a3-c9c37efce476-kube-api-access-nqnnj\") pod \"packageserver-d55dfcdfc-7trl6\" (UID: \"d98a1c63-4a49-437d-97a3-c9c37efce476\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7trl6" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.433702 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/33ad6890-1e95-4a46-9f40-00cf0443d4a1-stats-auth\") pod \"router-default-5444994796-pldb7\" (UID: \"33ad6890-1e95-4a46-9f40-00cf0443d4a1\") " pod="openshift-ingress/router-default-5444994796-pldb7" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.433716 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/dc25bf0a-f5e4-4f6d-9442-6b55a043f751-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-rsbss\" (UID: \"dc25bf0a-f5e4-4f6d-9442-6b55a043f751\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rsbss" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.433729 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/bc98cfae-5f65-416a-ba4b-24f601ff6209-certs\") pod \"machine-config-server-whvjt\" (UID: \"bc98cfae-5f65-416a-ba4b-24f601ff6209\") " pod="openshift-machine-config-operator/machine-config-server-whvjt" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.433742 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxkkl\" (UniqueName: \"kubernetes.io/projected/a77db5eb-461b-49d7-a2d7-91be5bb52a09-kube-api-access-fxkkl\") pod \"catalog-operator-68c6474976-gt6wf\" (UID: \"a77db5eb-461b-49d7-a2d7-91be5bb52a09\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gt6wf" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.433757 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldmfx\" (UniqueName: \"kubernetes.io/projected/0445dab5-5a42-4098-9fe0-e38b5650ecd6-kube-api-access-ldmfx\") pod \"csi-hostpathplugin-2wsmx\" (UID: \"0445dab5-5a42-4098-9fe0-e38b5650ecd6\") " 
pod="hostpath-provisioner/csi-hostpathplugin-2wsmx" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.433774 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3936549f-6055-4b0d-af18-7958d91c672f-secret-volume\") pod \"collect-profiles-29421540-hhxsf\" (UID: \"3936549f-6055-4b0d-af18-7958d91c672f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421540-hhxsf" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.433788 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/147065ae-e48c-4712-8d72-016b2a692da8-profile-collector-cert\") pod \"olm-operator-6b444d44fb-skgqs\" (UID: \"147065ae-e48c-4712-8d72-016b2a692da8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-skgqs" Dec 09 15:00:25 crc kubenswrapper[4555]: E1209 15:00:25.434465 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 15:00:25.934429595 +0000 UTC m=+136.071142313 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.435482 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/147065ae-e48c-4712-8d72-016b2a692da8-srv-cert\") pod \"olm-operator-6b444d44fb-skgqs\" (UID: \"147065ae-e48c-4712-8d72-016b2a692da8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-skgqs" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.435629 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/0445dab5-5a42-4098-9fe0-e38b5650ecd6-mountpoint-dir\") pod \"csi-hostpathplugin-2wsmx\" (UID: \"0445dab5-5a42-4098-9fe0-e38b5650ecd6\") " pod="hostpath-provisioner/csi-hostpathplugin-2wsmx" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.435743 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0445dab5-5a42-4098-9fe0-e38b5650ecd6-registration-dir\") pod \"csi-hostpathplugin-2wsmx\" (UID: \"0445dab5-5a42-4098-9fe0-e38b5650ecd6\") " pod="hostpath-provisioner/csi-hostpathplugin-2wsmx" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.436430 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/c03f49e6-2884-451b-af8c-9d47555d1413-signing-cabundle\") pod \"service-ca-9c57cc56f-4tvxr\" (UID: \"c03f49e6-2884-451b-af8c-9d47555d1413\") " pod="openshift-service-ca/service-ca-9c57cc56f-4tvxr" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.436526 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/8c1c0513-5d12-463c-b66d-c1a5d02eafc0-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-srlsw\" (UID: \"8c1c0513-5d12-463c-b66d-c1a5d02eafc0\") " pod="openshift-marketplace/marketplace-operator-79b997595-srlsw" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.437212 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8c1c0513-5d12-463c-b66d-c1a5d02eafc0-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-srlsw\" (UID: \"8c1c0513-5d12-463c-b66d-c1a5d02eafc0\") " pod="openshift-marketplace/marketplace-operator-79b997595-srlsw" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.437817 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/0445dab5-5a42-4098-9fe0-e38b5650ecd6-plugins-dir\") pod \"csi-hostpathplugin-2wsmx\" (UID: \"0445dab5-5a42-4098-9fe0-e38b5650ecd6\") " pod="hostpath-provisioner/csi-hostpathplugin-2wsmx" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.438273 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d98a1c63-4a49-437d-97a3-c9c37efce476-apiservice-cert\") pod \"packageserver-d55dfcdfc-7trl6\" (UID: \"d98a1c63-4a49-437d-97a3-c9c37efce476\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7trl6" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.439229 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mk586" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.439396 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2267e449-66c1-4699-897b-9851e54d2645-config-volume\") pod \"dns-default-9l7sb\" (UID: \"2267e449-66c1-4699-897b-9851e54d2645\") " pod="openshift-dns/dns-default-9l7sb" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.439971 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/33ad6890-1e95-4a46-9f40-00cf0443d4a1-service-ca-bundle\") pod \"router-default-5444994796-pldb7\" (UID: \"33ad6890-1e95-4a46-9f40-00cf0443d4a1\") " pod="openshift-ingress/router-default-5444994796-pldb7" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.440443 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/33ad6890-1e95-4a46-9f40-00cf0443d4a1-metrics-certs\") pod \"router-default-5444994796-pldb7\" (UID: \"33ad6890-1e95-4a46-9f40-00cf0443d4a1\") " pod="openshift-ingress/router-default-5444994796-pldb7" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.440626 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/895674c7-2ae3-4c93-8fcf-a2e1b7f2dfcb-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-hgkns\" (UID: \"895674c7-2ae3-4c93-8fcf-a2e1b7f2dfcb\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hgkns" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.440810 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/147065ae-e48c-4712-8d72-016b2a692da8-profile-collector-cert\") pod 
\"olm-operator-6b444d44fb-skgqs\" (UID: \"147065ae-e48c-4712-8d72-016b2a692da8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-skgqs" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.440938 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/bc98cfae-5f65-416a-ba4b-24f601ff6209-certs\") pod \"machine-config-server-whvjt\" (UID: \"bc98cfae-5f65-416a-ba4b-24f601ff6209\") " pod="openshift-machine-config-operator/machine-config-server-whvjt" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.441000 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/d98a1c63-4a49-437d-97a3-c9c37efce476-tmpfs\") pod \"packageserver-d55dfcdfc-7trl6\" (UID: \"d98a1c63-4a49-437d-97a3-c9c37efce476\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7trl6" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.441213 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/33ad6890-1e95-4a46-9f40-00cf0443d4a1-stats-auth\") pod \"router-default-5444994796-pldb7\" (UID: \"33ad6890-1e95-4a46-9f40-00cf0443d4a1\") " pod="openshift-ingress/router-default-5444994796-pldb7" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.441275 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0445dab5-5a42-4098-9fe0-e38b5650ecd6-socket-dir\") pod \"csi-hostpathplugin-2wsmx\" (UID: \"0445dab5-5a42-4098-9fe0-e38b5650ecd6\") " pod="hostpath-provisioner/csi-hostpathplugin-2wsmx" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.441339 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c1b1ecba-4701-4e11-b2e9-1327cc2c5c2d-cert\") pod \"ingress-canary-89w6m\" (UID: \"c1b1ecba-4701-4e11-b2e9-1327cc2c5c2d\") " pod="openshift-ingress-canary/ingress-canary-89w6m" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.441656 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d98a1c63-4a49-437d-97a3-c9c37efce476-webhook-cert\") pod \"packageserver-d55dfcdfc-7trl6\" (UID: \"d98a1c63-4a49-437d-97a3-c9c37efce476\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7trl6" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.441759 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d73470df-6b49-4a9f-bcd8-a0acac5cb921-proxy-tls\") pod \"machine-config-operator-74547568cd-r9tjc\" (UID: \"d73470df-6b49-4a9f-bcd8-a0acac5cb921\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-r9tjc" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.441863 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/0445dab5-5a42-4098-9fe0-e38b5650ecd6-csi-data-dir\") pod \"csi-hostpathplugin-2wsmx\" (UID: \"0445dab5-5a42-4098-9fe0-e38b5650ecd6\") " pod="hostpath-provisioner/csi-hostpathplugin-2wsmx" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.441923 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d73470df-6b49-4a9f-bcd8-a0acac5cb921-auth-proxy-config\") pod \"machine-config-operator-74547568cd-r9tjc\" 
(UID: \"d73470df-6b49-4a9f-bcd8-a0acac5cb921\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-r9tjc" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.441995 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/33ad6890-1e95-4a46-9f40-00cf0443d4a1-default-certificate\") pod \"router-default-5444994796-pldb7\" (UID: \"33ad6890-1e95-4a46-9f40-00cf0443d4a1\") " pod="openshift-ingress/router-default-5444994796-pldb7" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.442061 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3936549f-6055-4b0d-af18-7958d91c672f-config-volume\") pod \"collect-profiles-29421540-hhxsf\" (UID: \"3936549f-6055-4b0d-af18-7958d91c672f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421540-hhxsf" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.442679 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/c03f49e6-2884-451b-af8c-9d47555d1413-signing-key\") pod \"service-ca-9c57cc56f-4tvxr\" (UID: \"c03f49e6-2884-451b-af8c-9d47555d1413\") " pod="openshift-service-ca/service-ca-9c57cc56f-4tvxr" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.442929 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d73470df-6b49-4a9f-bcd8-a0acac5cb921-images\") pod \"machine-config-operator-74547568cd-r9tjc\" (UID: \"d73470df-6b49-4a9f-bcd8-a0acac5cb921\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-r9tjc" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.444062 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/468e155a-66ab-4823-b737-7e1f9c78ac3f-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-wt67p\" (UID: \"468e155a-66ab-4823-b737-7e1f9c78ac3f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-wt67p" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.444076 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1908bc6b-523d-4ffd-8e4c-e73f3e3b7ee3-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-msfrj\" (UID: \"1908bc6b-523d-4ffd-8e4c-e73f3e3b7ee3\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-msfrj" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.444089 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2267e449-66c1-4699-897b-9851e54d2645-metrics-tls\") pod \"dns-default-9l7sb\" (UID: \"2267e449-66c1-4699-897b-9851e54d2645\") " pod="openshift-dns/dns-default-9l7sb" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.444422 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a77db5eb-461b-49d7-a2d7-91be5bb52a09-profile-collector-cert\") pod \"catalog-operator-68c6474976-gt6wf\" (UID: \"a77db5eb-461b-49d7-a2d7-91be5bb52a09\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gt6wf" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.444621 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"secret-volume\" (UniqueName: \"kubernetes.io/secret/3936549f-6055-4b0d-af18-7958d91c672f-secret-volume\") pod \"collect-profiles-29421540-hhxsf\" (UID: \"3936549f-6055-4b0d-af18-7958d91c672f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421540-hhxsf" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.444994 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/bc98cfae-5f65-416a-ba4b-24f601ff6209-node-bootstrap-token\") pod \"machine-config-server-whvjt\" (UID: \"bc98cfae-5f65-416a-ba4b-24f601ff6209\") " pod="openshift-machine-config-operator/machine-config-server-whvjt" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.445090 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/dc25bf0a-f5e4-4f6d-9442-6b55a043f751-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-rsbss\" (UID: \"dc25bf0a-f5e4-4f6d-9442-6b55a043f751\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rsbss" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.445366 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a77db5eb-461b-49d7-a2d7-91be5bb52a09-srv-cert\") pod \"catalog-operator-68c6474976-gt6wf\" (UID: \"a77db5eb-461b-49d7-a2d7-91be5bb52a09\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gt6wf" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.445972 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/895674c7-2ae3-4c93-8fcf-a2e1b7f2dfcb-proxy-tls\") pod \"machine-config-controller-84d6567774-hgkns\" (UID: \"895674c7-2ae3-4c93-8fcf-a2e1b7f2dfcb\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hgkns" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.447402 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/897e7b14-cb3a-40e7-a127-587bc5e9502a-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-gkckz\" (UID: \"897e7b14-cb3a-40e7-a127-587bc5e9502a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gkckz" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.465692 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jjbm\" (UniqueName: \"kubernetes.io/projected/13cf2230-0cbf-49b1-92f4-b49b8d72acec-kube-api-access-6jjbm\") pod \"apiserver-7bbb656c7d-5xggk\" (UID: \"13cf2230-0cbf-49b1-92f4-b49b8d72acec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5xggk" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.482326 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-nqlbc" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.486442 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9krkk\" (UniqueName: \"kubernetes.io/projected/6f6a1efb-6c81-45a4-91e2-9f67f8a266be-kube-api-access-9krkk\") pod \"service-ca-operator-777779d784-b27rg\" (UID: \"6f6a1efb-6c81-45a4-91e2-9f67f8a266be\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-b27rg" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.494275 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-5ncnf"] Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.508501 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnh65\" (UniqueName: \"kubernetes.io/projected/0a4f7efd-8434-48a7-979e-9f128e93797b-kube-api-access-nnh65\") pod \"cluster-samples-operator-665b6dd947-jxl9b\" (UID: \"0a4f7efd-8434-48a7-979e-9f128e93797b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jxl9b" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.510201 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-vhncm" Dec 09 15:00:25 crc kubenswrapper[4555]: W1209 15:00:25.518665 4555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc16a9d27_0e19_4e91_ad2b_0a5e3948e19c.slice/crio-c2b9cc15538c2e8d581dff6118c22f3c3453eebb681664b720f712c6ee6621e2 WatchSource:0}: Error finding container c2b9cc15538c2e8d581dff6118c22f3c3453eebb681664b720f712c6ee6621e2: Status 404 returned error can't find the container with id c2b9cc15538c2e8d581dff6118c22f3c3453eebb681664b720f712c6ee6621e2 Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.524886 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sx7h6\" (UniqueName: \"kubernetes.io/projected/d3f2ca3d-d5ad-4a22-96d4-3bc4c5eb7a96-kube-api-access-sx7h6\") pod \"ingress-operator-5b745b69d9-2f6gc\" (UID: \"d3f2ca3d-d5ad-4a22-96d4-3bc4c5eb7a96\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2f6gc" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.532523 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gkckz" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.534708 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wd6tn\" (UID: \"eee41022-2a9c-41c8-bbc8-ff42874c4050\") " pod="openshift-image-registry/image-registry-697d97f7c8-wd6tn" Dec 09 15:00:25 crc kubenswrapper[4555]: E1209 15:00:25.535008 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 15:00:26.034996248 +0000 UTC m=+136.171708966 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wd6tn" (UID: "eee41022-2a9c-41c8-bbc8-ff42874c4050") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.543266 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-v5j6b" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.546406 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/adaadc75-0b7d-4699-8d61-fe5ebb7dc84a-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-9xbb6\" (UID: \"adaadc75-0b7d-4699-8d61-fe5ebb7dc84a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9xbb6" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.553820 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2f6gc" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.567602 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nv9fc\" (UniqueName: \"kubernetes.io/projected/7ce81b97-c663-47e5-b75e-ba3333d5cb9a-kube-api-access-nv9fc\") pod \"machine-approver-56656f9798-8fb58\" (UID: \"7ce81b97-c663-47e5-b75e-ba3333d5cb9a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8fb58" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.570179 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-b27rg" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.582352 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-vbr87" event={"ID":"f86821ee-dfd8-4090-bd6c-8f624a1362ae","Type":"ContainerStarted","Data":"8dcb8783f8c9f94a27b9840ab4ddc426cc4ddd9db8262c9dd5c2c3e801382c83"} Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.582405 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-vbr87" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.582417 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-vbr87" event={"ID":"f86821ee-dfd8-4090-bd6c-8f624a1362ae","Type":"ContainerStarted","Data":"b77f95d494f228666f8790acd98fd9284669a6612ee5c458b2667802235ba435"} Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.582442 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mk586"] Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.588280 4555 patch_prober.go:28] interesting pod/downloads-7954f5f757-vbr87 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.588326 4555 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-vbr87" podUID="f86821ee-dfd8-4090-bd6c-8f624a1362ae" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.588350 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dn2kc\" (UniqueName: \"kubernetes.io/projected/eee41022-2a9c-41c8-bbc8-ff42874c4050-kube-api-access-dn2kc\") pod \"image-registry-697d97f7c8-wd6tn\" (UID: \"eee41022-2a9c-41c8-bbc8-ff42874c4050\") " pod="openshift-image-registry/image-registry-697d97f7c8-wd6tn" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.591123 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7h6sb" event={"ID":"61f20e90-b252-48ed-978a-8aec4b3d98ba","Type":"ContainerStarted","Data":"7e153daf0c1e2b34aad75bf5968f6f7f7a6cc6a64296db8edfbfc5b05da64e0d"} Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.591161 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7h6sb" event={"ID":"61f20e90-b252-48ed-978a-8aec4b3d98ba","Type":"ContainerStarted","Data":"72af49ec40c6496f55ef9288eccb1aef70679e26c7a012c89574cc4c8df5fcff"} Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.594708 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-llvq5" event={"ID":"86d9973d-84c0-4a1b-9e86-7609a79e4bfa","Type":"ContainerStarted","Data":"cccacfe2969319cce6b86d9dd76bc09ac81c9f775f640fe9cee196ee6e3db9d9"} Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.594728 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-llvq5" 
event={"ID":"86d9973d-84c0-4a1b-9e86-7609a79e4bfa","Type":"ContainerStarted","Data":"d3e20bb882c5f59ec3063af50634885572fc2ed5d2e543d39021d88fd3c2fc84"} Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.595986 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qdpr6" event={"ID":"d0f5603f-8e68-4cfb-9aa1-a6854b684172","Type":"ContainerStarted","Data":"05fee7b2bfa9af8f8109fe4522cf82cee4fa3aea171cbb4d623eb4ea0d81f4fa"} Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.596010 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qdpr6" event={"ID":"d0f5603f-8e68-4cfb-9aa1-a6854b684172","Type":"ContainerStarted","Data":"2e194e8279b55b50ed2f0451d6c3122ce427b731c639a95d8944879b378b1e91"} Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.596537 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qdpr6" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.597807 4555 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-qdpr6 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.597856 4555 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qdpr6" podUID="d0f5603f-8e68-4cfb-9aa1-a6854b684172" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.598128 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-5ncnf" event={"ID":"c16a9d27-0e19-4e91-ad2b-0a5e3948e19c","Type":"ContainerStarted","Data":"c2b9cc15538c2e8d581dff6118c22f3c3453eebb681664b720f712c6ee6621e2"} Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.618297 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8jwt\" (UniqueName: \"kubernetes.io/projected/77dac49a-34a8-421c-99ab-1fa4a08fc6fa-kube-api-access-p8jwt\") pod \"openshift-config-operator-7777fb866f-mtdft\" (UID: \"77dac49a-34a8-421c-99ab-1fa4a08fc6fa\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mtdft" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.620621 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-rf8rn" event={"ID":"dae2070b-58ae-47b7-8609-3ca457ed809a","Type":"ContainerStarted","Data":"28da5b30abf7efc9e301eca9bc2d4566e6101847965e922da990de0153d47328"} Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.620667 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-rf8rn" event={"ID":"dae2070b-58ae-47b7-8609-3ca457ed809a","Type":"ContainerStarted","Data":"df8b2176d8c6d2cf4e427ada2bfdcf8a050a4d35e607d5589965653a17529715"} Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.621386 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-rf8rn" Dec 09 15:00:25 crc 
kubenswrapper[4555]: I1209 15:00:25.625277 4555 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-rf8rn container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.625395 4555 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-rf8rn" podUID="dae2070b-58ae-47b7-8609-3ca457ed809a" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.633007 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmt28\" (UniqueName: \"kubernetes.io/projected/adaadc75-0b7d-4699-8d61-fe5ebb7dc84a-kube-api-access-gmt28\") pod \"cluster-image-registry-operator-dc59b4c8b-9xbb6\" (UID: \"adaadc75-0b7d-4699-8d61-fe5ebb7dc84a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9xbb6" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.635657 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 15:00:25 crc kubenswrapper[4555]: E1209 15:00:25.636027 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 15:00:26.136005318 +0000 UTC m=+136.272718036 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.645671 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwzpz\" (UniqueName: \"kubernetes.io/projected/5446f927-f240-4554-832a-28b950424eea-kube-api-access-qwzpz\") pod \"dns-operator-744455d44c-srtkn\" (UID: \"5446f927-f240-4554-832a-28b950424eea\") " pod="openshift-dns-operator/dns-operator-744455d44c-srtkn" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.674646 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbl9s\" (UniqueName: \"kubernetes.io/projected/b5a1d5a2-358f-4445-9253-5a9bc8b9daf3-kube-api-access-xbl9s\") pod \"openshift-controller-manager-operator-756b6f6bc6-wzp94\" (UID: \"b5a1d5a2-358f-4445-9253-5a9bc8b9daf3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wzp94" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.688792 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnspm\" (UniqueName: \"kubernetes.io/projected/5926dc72-d5e9-4c49-afba-ac1ce69678be-kube-api-access-dnspm\") pod \"console-operator-58897d9998-7drj7\" (UID: \"5926dc72-d5e9-4c49-afba-ac1ce69678be\") " pod="openshift-console-operator/console-operator-58897d9998-7drj7" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.689429 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gkckz"] Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.690131 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8fb58" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.707097 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e376f745-76d4-42dd-97f7-4554c02eec74-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-pw6lj\" (UID: \"e376f745-76d4-42dd-97f7-4554c02eec74\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-pw6lj" Dec 09 15:00:25 crc kubenswrapper[4555]: W1209 15:00:25.707678 4555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod897e7b14_cb3a_40e7_a127_587bc5e9502a.slice/crio-b34ac9407743c47f234a5603eb8de26c8dce2dd3f3b17bb6abc1799892b055a2 WatchSource:0}: Error finding container b34ac9407743c47f234a5603eb8de26c8dce2dd3f3b17bb6abc1799892b055a2: Status 404 returned error can't find the container with id b34ac9407743c47f234a5603eb8de26c8dce2dd3f3b17bb6abc1799892b055a2 Dec 09 15:00:25 crc kubenswrapper[4555]: W1209 15:00:25.714404 4555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ce81b97_c663_47e5_b75e_ba3333d5cb9a.slice/crio-bad32cbce265f9372a5dc94466e78fc478d6de0606efca168f3ce9f6b486ccf2 WatchSource:0}: Error finding container bad32cbce265f9372a5dc94466e78fc478d6de0606efca168f3ce9f6b486ccf2: Status 404 returned error can't find the container with id bad32cbce265f9372a5dc94466e78fc478d6de0606efca168f3ce9f6b486ccf2 Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.725939 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/eee41022-2a9c-41c8-bbc8-ff42874c4050-bound-sa-token\") pod \"image-registry-697d97f7c8-wd6tn\" (UID: \"eee41022-2a9c-41c8-bbc8-ff42874c4050\") " pod="openshift-image-registry/image-registry-697d97f7c8-wd6tn" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.736817 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wd6tn\" (UID: \"eee41022-2a9c-41c8-bbc8-ff42874c4050\") " pod="openshift-image-registry/image-registry-697d97f7c8-wd6tn" Dec 09 15:00:25 crc kubenswrapper[4555]: E1209 15:00:25.737983 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 15:00:26.237968976 +0000 UTC m=+136.374681696 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wd6tn" (UID: "eee41022-2a9c-41c8-bbc8-ff42874c4050") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.738448 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5xggk" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.745577 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpcg2\" (UniqueName: \"kubernetes.io/projected/b7402585-182c-46a1-a8ac-257c97421c8d-kube-api-access-cpcg2\") pod \"apiserver-76f77b778f-xdwlv\" (UID: \"b7402585-182c-46a1-a8ac-257c97421c8d\") " pod="openshift-apiserver/apiserver-76f77b778f-xdwlv" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.757597 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-xdwlv" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.766725 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6698v\" (UniqueName: \"kubernetes.io/projected/2d15dddf-4a72-4590-9def-755a9fd11da6-kube-api-access-6698v\") pod \"authentication-operator-69f744f599-5kplw\" (UID: \"2d15dddf-4a72-4590-9def-755a9fd11da6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5kplw" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.777011 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jxl9b" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.802498 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-5kplw" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.804517 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mtdft" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.806483 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fswsp\" (UniqueName: \"kubernetes.io/projected/c1b1ecba-4701-4e11-b2e9-1327cc2c5c2d-kube-api-access-fswsp\") pod \"ingress-canary-89w6m\" (UID: \"c1b1ecba-4701-4e11-b2e9-1327cc2c5c2d\") " pod="openshift-ingress-canary/ingress-canary-89w6m" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.823385 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-srtkn" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.823407 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-7drj7" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.829260 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbxs9\" (UniqueName: \"kubernetes.io/projected/895674c7-2ae3-4c93-8fcf-a2e1b7f2dfcb-kube-api-access-mbxs9\") pod \"machine-config-controller-84d6567774-hgkns\" (UID: \"895674c7-2ae3-4c93-8fcf-a2e1b7f2dfcb\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hgkns" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.840182 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-pw6lj" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.840999 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 15:00:25 crc kubenswrapper[4555]: E1209 15:00:25.841120 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 15:00:26.341104716 +0000 UTC m=+136.477817434 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.845256 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wd6tn\" (UID: \"eee41022-2a9c-41c8-bbc8-ff42874c4050\") " pod="openshift-image-registry/image-registry-697d97f7c8-wd6tn" Dec 09 15:00:25 crc kubenswrapper[4555]: E1209 15:00:25.845761 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 15:00:26.34575323 +0000 UTC m=+136.482465949 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wd6tn" (UID: "eee41022-2a9c-41c8-bbc8-ff42874c4050") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.848410 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wzp94" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.851117 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zl7qg\" (UniqueName: \"kubernetes.io/projected/dc25bf0a-f5e4-4f6d-9442-6b55a043f751-kube-api-access-zl7qg\") pod \"control-plane-machine-set-operator-78cbb6b69f-rsbss\" (UID: \"dc25bf0a-f5e4-4f6d-9442-6b55a043f751\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rsbss" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.860762 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9xbb6" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.865549 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrkvg\" (UniqueName: \"kubernetes.io/projected/d73470df-6b49-4a9f-bcd8-a0acac5cb921-kube-api-access-vrkvg\") pod \"machine-config-operator-74547568cd-r9tjc\" (UID: \"d73470df-6b49-4a9f-bcd8-a0acac5cb921\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-r9tjc" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.872163 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-b27rg"] Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.879130 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-nqlbc"] Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.887068 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hgkns" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.891239 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvf7f\" (UniqueName: \"kubernetes.io/projected/dd026b3b-b5df-4f5a-81c8-2bf0b6b8c766-kube-api-access-zvf7f\") pod \"migrator-59844c95c7-89bvk\" (UID: \"dd026b3b-b5df-4f5a-81c8-2bf0b6b8c766\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-89bvk" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.912895 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxkkl\" (UniqueName: \"kubernetes.io/projected/a77db5eb-461b-49d7-a2d7-91be5bb52a09-kube-api-access-fxkkl\") pod \"catalog-operator-68c6474976-gt6wf\" (UID: \"a77db5eb-461b-49d7-a2d7-91be5bb52a09\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gt6wf" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.939842 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-r9tjc" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.941762 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldmfx\" (UniqueName: \"kubernetes.io/projected/0445dab5-5a42-4098-9fe0-e38b5650ecd6-kube-api-access-ldmfx\") pod \"csi-hostpathplugin-2wsmx\" (UID: \"0445dab5-5a42-4098-9fe0-e38b5650ecd6\") " pod="hostpath-provisioner/csi-hostpathplugin-2wsmx" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.948128 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 15:00:25 crc kubenswrapper[4555]: E1209 15:00:25.948447 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 15:00:26.448434849 +0000 UTC m=+136.585147567 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:25 crc kubenswrapper[4555]: W1209 15:00:25.952421 4555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode40fc0c2_28f1_444d_8ff4_8f11faa1db2b.slice/crio-bda8f71c58d373adb9972866e249b5c82d6660f874f3764bccdfe1d4c10f4e5a WatchSource:0}: Error finding container bda8f71c58d373adb9972866e249b5c82d6660f874f3764bccdfe1d4c10f4e5a: Status 404 returned error can't find the container with id bda8f71c58d373adb9972866e249b5c82d6660f874f3764bccdfe1d4c10f4e5a Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.959286 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-89w6m" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.959741 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wj6jr\" (UniqueName: \"kubernetes.io/projected/bc98cfae-5f65-416a-ba4b-24f601ff6209-kube-api-access-wj6jr\") pod \"machine-config-server-whvjt\" (UID: \"bc98cfae-5f65-416a-ba4b-24f601ff6209\") " pod="openshift-machine-config-operator/machine-config-server-whvjt" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.965281 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rsbss" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.965459 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-5xggk"] Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.974021 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-whvjt" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.976074 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbhqh\" (UniqueName: \"kubernetes.io/projected/3936549f-6055-4b0d-af18-7958d91c672f-kube-api-access-wbhqh\") pod \"collect-profiles-29421540-hhxsf\" (UID: \"3936549f-6055-4b0d-af18-7958d91c672f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421540-hhxsf" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.987212 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-2wsmx" Dec 09 15:00:25 crc kubenswrapper[4555]: I1209 15:00:25.999656 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-vhncm"] Dec 09 15:00:26 crc kubenswrapper[4555]: I1209 15:00:26.005955 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-xdwlv"] Dec 09 15:00:26 crc kubenswrapper[4555]: I1209 15:00:26.005979 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzpv7\" (UniqueName: \"kubernetes.io/projected/1908bc6b-523d-4ffd-8e4c-e73f3e3b7ee3-kube-api-access-rzpv7\") pod \"multus-admission-controller-857f4d67dd-msfrj\" (UID: \"1908bc6b-523d-4ffd-8e4c-e73f3e3b7ee3\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-msfrj" Dec 09 15:00:26 crc kubenswrapper[4555]: I1209 15:00:26.021199 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqnnj\" (UniqueName: \"kubernetes.io/projected/d98a1c63-4a49-437d-97a3-c9c37efce476-kube-api-access-nqnnj\") pod \"packageserver-d55dfcdfc-7trl6\" (UID: \"d98a1c63-4a49-437d-97a3-c9c37efce476\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7trl6" Dec 09 15:00:26 crc kubenswrapper[4555]: I1209 15:00:26.061934 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wd6tn\" (UID: \"eee41022-2a9c-41c8-bbc8-ff42874c4050\") " pod="openshift-image-registry/image-registry-697d97f7c8-wd6tn" Dec 09 15:00:26 crc kubenswrapper[4555]: E1209 15:00:26.062877 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 15:00:26.562313546 +0000 UTC m=+136.699026264 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wd6tn" (UID: "eee41022-2a9c-41c8-bbc8-ff42874c4050") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:26 crc kubenswrapper[4555]: I1209 15:00:26.064316 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-2f6gc"] Dec 09 15:00:26 crc kubenswrapper[4555]: I1209 15:00:26.072894 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-v5j6b"] Dec 09 15:00:26 crc kubenswrapper[4555]: I1209 15:00:26.075886 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gs7v4\" (UniqueName: \"kubernetes.io/projected/468e155a-66ab-4823-b737-7e1f9c78ac3f-kube-api-access-gs7v4\") pod \"package-server-manager-789f6589d5-wt67p\" (UID: \"468e155a-66ab-4823-b737-7e1f9c78ac3f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-wt67p" Dec 09 15:00:26 crc kubenswrapper[4555]: W1209 15:00:26.088027 4555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7402585_182c_46a1_a8ac_257c97421c8d.slice/crio-7678974e7433c04e0241b34eadbaf24c308aedaa0be2b70bc7679381145adda8 WatchSource:0}: Error finding container 7678974e7433c04e0241b34eadbaf24c308aedaa0be2b70bc7679381145adda8: Status 404 returned error can't find the container with id 7678974e7433c04e0241b34eadbaf24c308aedaa0be2b70bc7679381145adda8 Dec 09 15:00:26 crc kubenswrapper[4555]: I1209 15:00:26.092869 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-df2gb\" (UniqueName: \"kubernetes.io/projected/33ad6890-1e95-4a46-9f40-00cf0443d4a1-kube-api-access-df2gb\") pod \"router-default-5444994796-pldb7\" (UID: \"33ad6890-1e95-4a46-9f40-00cf0443d4a1\") " pod="openshift-ingress/router-default-5444994796-pldb7" Dec 09 15:00:26 crc kubenswrapper[4555]: I1209 15:00:26.093886 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4dds\" (UniqueName: \"kubernetes.io/projected/8c1c0513-5d12-463c-b66d-c1a5d02eafc0-kube-api-access-z4dds\") pod \"marketplace-operator-79b997595-srlsw\" (UID: \"8c1c0513-5d12-463c-b66d-c1a5d02eafc0\") " pod="openshift-marketplace/marketplace-operator-79b997595-srlsw" Dec 09 15:00:26 crc kubenswrapper[4555]: I1209 15:00:26.111157 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68q49\" (UniqueName: \"kubernetes.io/projected/147065ae-e48c-4712-8d72-016b2a692da8-kube-api-access-68q49\") pod \"olm-operator-6b444d44fb-skgqs\" (UID: \"147065ae-e48c-4712-8d72-016b2a692da8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-skgqs" Dec 09 15:00:26 crc kubenswrapper[4555]: W1209 15:00:26.122796 4555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7fc2be6_0cb9_4fad_ac58_37004264a563.slice/crio-112374ee17f88d33cbf08266528bee8cbb4629d6ff74862e0fa4b05b0d3c27ae WatchSource:0}: Error finding container 112374ee17f88d33cbf08266528bee8cbb4629d6ff74862e0fa4b05b0d3c27ae: Status 404 returned error can't find the container with id 
112374ee17f88d33cbf08266528bee8cbb4629d6ff74862e0fa4b05b0d3c27ae Dec 09 15:00:26 crc kubenswrapper[4555]: I1209 15:00:26.126002 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7h2lr\" (UniqueName: \"kubernetes.io/projected/c03f49e6-2884-451b-af8c-9d47555d1413-kube-api-access-7h2lr\") pod \"service-ca-9c57cc56f-4tvxr\" (UID: \"c03f49e6-2884-451b-af8c-9d47555d1413\") " pod="openshift-service-ca/service-ca-9c57cc56f-4tvxr" Dec 09 15:00:26 crc kubenswrapper[4555]: I1209 15:00:26.129494 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jxl9b"] Dec 09 15:00:26 crc kubenswrapper[4555]: I1209 15:00:26.158708 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxzb9\" (UniqueName: \"kubernetes.io/projected/2267e449-66c1-4699-897b-9851e54d2645-kube-api-access-jxzb9\") pod \"dns-default-9l7sb\" (UID: \"2267e449-66c1-4699-897b-9851e54d2645\") " pod="openshift-dns/dns-default-9l7sb" Dec 09 15:00:26 crc kubenswrapper[4555]: I1209 15:00:26.163225 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 15:00:26 crc kubenswrapper[4555]: E1209 15:00:26.164053 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 15:00:26.664034144 +0000 UTC m=+136.800746862 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:26 crc kubenswrapper[4555]: I1209 15:00:26.173688 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7trl6" Dec 09 15:00:26 crc kubenswrapper[4555]: I1209 15:00:26.187637 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-89bvk" Dec 09 15:00:26 crc kubenswrapper[4555]: I1209 15:00:26.209329 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gt6wf" Dec 09 15:00:26 crc kubenswrapper[4555]: I1209 15:00:26.209449 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-srlsw" Dec 09 15:00:26 crc kubenswrapper[4555]: I1209 15:00:26.215695 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-wt67p" Dec 09 15:00:26 crc kubenswrapper[4555]: I1209 15:00:26.221089 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-pldb7" Dec 09 15:00:26 crc kubenswrapper[4555]: I1209 15:00:26.233568 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-4tvxr" Dec 09 15:00:26 crc kubenswrapper[4555]: I1209 15:00:26.233849 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-skgqs" Dec 09 15:00:26 crc kubenswrapper[4555]: I1209 15:00:26.246952 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421540-hhxsf" Dec 09 15:00:26 crc kubenswrapper[4555]: I1209 15:00:26.256118 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-msfrj" Dec 09 15:00:26 crc kubenswrapper[4555]: I1209 15:00:26.266918 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wd6tn\" (UID: \"eee41022-2a9c-41c8-bbc8-ff42874c4050\") " pod="openshift-image-registry/image-registry-697d97f7c8-wd6tn" Dec 09 15:00:26 crc kubenswrapper[4555]: E1209 15:00:26.267191 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 15:00:26.767181926 +0000 UTC m=+136.903894644 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wd6tn" (UID: "eee41022-2a9c-41c8-bbc8-ff42874c4050") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:26 crc kubenswrapper[4555]: I1209 15:00:26.304093 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-5kplw"] Dec 09 15:00:26 crc kubenswrapper[4555]: I1209 15:00:26.304128 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-mtdft"] Dec 09 15:00:26 crc kubenswrapper[4555]: I1209 15:00:26.304333 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-9l7sb" Dec 09 15:00:26 crc kubenswrapper[4555]: I1209 15:00:26.368106 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 15:00:26 crc kubenswrapper[4555]: E1209 15:00:26.368532 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 15:00:26.868518227 +0000 UTC m=+137.005230934 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:26 crc kubenswrapper[4555]: I1209 15:00:26.470428 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wd6tn\" (UID: \"eee41022-2a9c-41c8-bbc8-ff42874c4050\") " pod="openshift-image-registry/image-registry-697d97f7c8-wd6tn" Dec 09 15:00:26 crc kubenswrapper[4555]: E1209 15:00:26.470877 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 15:00:26.970865952 +0000 UTC m=+137.107578670 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wd6tn" (UID: "eee41022-2a9c-41c8-bbc8-ff42874c4050") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:26 crc kubenswrapper[4555]: I1209 15:00:26.576796 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 15:00:26 crc kubenswrapper[4555]: E1209 15:00:26.577215 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 15:00:27.077203095 +0000 UTC m=+137.213915813 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:26 crc kubenswrapper[4555]: I1209 15:00:26.641708 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-vhncm" event={"ID":"a7fc2be6-0cb9-4fad-ac58-37004264a563","Type":"ContainerStarted","Data":"112374ee17f88d33cbf08266528bee8cbb4629d6ff74862e0fa4b05b0d3c27ae"} Dec 09 15:00:26 crc kubenswrapper[4555]: I1209 15:00:26.642590 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jxl9b" event={"ID":"0a4f7efd-8434-48a7-979e-9f128e93797b","Type":"ContainerStarted","Data":"d2602f6bc2b7dba42839f99e2b2bf93df4cf2683a79b468f2e0be302e53cc527"} Dec 09 15:00:26 crc kubenswrapper[4555]: I1209 15:00:26.643731 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8fb58" event={"ID":"7ce81b97-c663-47e5-b75e-ba3333d5cb9a","Type":"ContainerStarted","Data":"c4e8999437a0d6a2cee94e86e55c26dacc88d7fe6f95bdf1fc503d63f453875d"} Dec 09 15:00:26 crc kubenswrapper[4555]: I1209 15:00:26.643756 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8fb58" event={"ID":"7ce81b97-c663-47e5-b75e-ba3333d5cb9a","Type":"ContainerStarted","Data":"bad32cbce265f9372a5dc94466e78fc478d6de0606efca168f3ce9f6b486ccf2"} Dec 09 15:00:26 crc kubenswrapper[4555]: I1209 15:00:26.645088 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-b27rg" event={"ID":"6f6a1efb-6c81-45a4-91e2-9f67f8a266be","Type":"ContainerStarted","Data":"f928e2fad4e8c8adf083bbe81d66ac07b7eb49f6949583a00993be133a646c64"} Dec 09 15:00:26 crc kubenswrapper[4555]: I1209 15:00:26.649202 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-pldb7" event={"ID":"33ad6890-1e95-4a46-9f40-00cf0443d4a1","Type":"ContainerStarted","Data":"b2a3e7fcfa7e2a1faedd9f7a619e4621eacc2c596592c6c035bfa9b91ef8d715"} Dec 09 15:00:26 crc kubenswrapper[4555]: I1209 15:00:26.652199 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gkckz" event={"ID":"897e7b14-cb3a-40e7-a127-587bc5e9502a","Type":"ContainerStarted","Data":"167b03db3db1e7b7ae52d292f265e6588feb1a4f2eb83762db32e40bb2c12216"} Dec 09 15:00:26 crc kubenswrapper[4555]: I1209 15:00:26.652228 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gkckz" event={"ID":"897e7b14-cb3a-40e7-a127-587bc5e9502a","Type":"ContainerStarted","Data":"b34ac9407743c47f234a5603eb8de26c8dce2dd3f3b17bb6abc1799892b055a2"} Dec 09 15:00:26 crc kubenswrapper[4555]: I1209 15:00:26.683387 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wd6tn\" (UID: 
\"eee41022-2a9c-41c8-bbc8-ff42874c4050\") " pod="openshift-image-registry/image-registry-697d97f7c8-wd6tn" Dec 09 15:00:26 crc kubenswrapper[4555]: E1209 15:00:26.683617 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 15:00:27.183607996 +0000 UTC m=+137.320320713 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wd6tn" (UID: "eee41022-2a9c-41c8-bbc8-ff42874c4050") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:26 crc kubenswrapper[4555]: I1209 15:00:26.687695 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-5kplw" event={"ID":"2d15dddf-4a72-4590-9def-755a9fd11da6","Type":"ContainerStarted","Data":"1606465b6cbdfcaf16de1ec1f876fc2a8a9f5800f4cbfd324e49ebd258aafede"} Dec 09 15:00:26 crc kubenswrapper[4555]: I1209 15:00:26.723471 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mtdft" event={"ID":"77dac49a-34a8-421c-99ab-1fa4a08fc6fa","Type":"ContainerStarted","Data":"9741c8f577a59570708dcea60c5a9cd4195f3e3d4963b84c3fd15bbe2a60ad3a"} Dec 09 15:00:26 crc kubenswrapper[4555]: I1209 15:00:26.732469 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5xggk" event={"ID":"13cf2230-0cbf-49b1-92f4-b49b8d72acec","Type":"ContainerStarted","Data":"65e3c73bc181b08d4a866bacf4d813187aaa094273fae4c06213566168ee0f82"} Dec 09 15:00:26 crc kubenswrapper[4555]: I1209 15:00:26.756660 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-nqlbc" event={"ID":"e40fc0c2-28f1-444d-8ff4-8f11faa1db2b","Type":"ContainerStarted","Data":"bda8f71c58d373adb9972866e249b5c82d6660f874f3764bccdfe1d4c10f4e5a"} Dec 09 15:00:26 crc kubenswrapper[4555]: I1209 15:00:26.769885 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-5ncnf" event={"ID":"c16a9d27-0e19-4e91-ad2b-0a5e3948e19c","Type":"ContainerStarted","Data":"0cdabaf131b51a5d7eb47a9ea7cee5ab9b460d8a7ce0668ee2ab91b99c7001f0"} Dec 09 15:00:26 crc kubenswrapper[4555]: I1209 15:00:26.770153 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-5ncnf" event={"ID":"c16a9d27-0e19-4e91-ad2b-0a5e3948e19c","Type":"ContainerStarted","Data":"6a4b894db175a60675ff2d4fdf72dfd93c0de833a831b7bf5c61d47010ca996f"} Dec 09 15:00:26 crc kubenswrapper[4555]: I1209 15:00:26.783754 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 15:00:26 crc kubenswrapper[4555]: E1209 15:00:26.784104 4555 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 15:00:27.284093044 +0000 UTC m=+137.420805761 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:26 crc kubenswrapper[4555]: I1209 15:00:26.797788 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2f6gc" event={"ID":"d3f2ca3d-d5ad-4a22-96d4-3bc4c5eb7a96","Type":"ContainerStarted","Data":"2ed8a744c6b0db6dfd21b92867df9f4b2feca2a4e8b1eb2b72b5b7808d61e9a4"} Dec 09 15:00:26 crc kubenswrapper[4555]: I1209 15:00:26.798781 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-v5j6b" event={"ID":"89fdb2fe-9563-4934-82c8-10c218d81bea","Type":"ContainerStarted","Data":"9b6a9c7b7dcd4eb25a2c8d2d0bbd99100cf80721e10db28a61e1e1e38b26ad45"} Dec 09 15:00:26 crc kubenswrapper[4555]: I1209 15:00:26.801123 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mk586" event={"ID":"b87660e6-af7f-42a9-bdb5-7bee97dab52f","Type":"ContainerStarted","Data":"bed68228f2dc34a0e252983ccf37eefda332ad5a7f30775fa5eae283722e6267"} Dec 09 15:00:26 crc kubenswrapper[4555]: I1209 15:00:26.801144 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mk586" event={"ID":"b87660e6-af7f-42a9-bdb5-7bee97dab52f","Type":"ContainerStarted","Data":"c1ce332fe997da5671ff2f4b446c06d1fccf5df1fa42a998ba77906929573201"} Dec 09 15:00:26 crc kubenswrapper[4555]: I1209 15:00:26.805578 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-whvjt" event={"ID":"bc98cfae-5f65-416a-ba4b-24f601ff6209","Type":"ContainerStarted","Data":"af3922afee7bb66075b0f3bd7453c555dddd93370eda981d7b65916ebd056e56"} Dec 09 15:00:26 crc kubenswrapper[4555]: I1209 15:00:26.811512 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-xdwlv" event={"ID":"b7402585-182c-46a1-a8ac-257c97421c8d","Type":"ContainerStarted","Data":"7678974e7433c04e0241b34eadbaf24c308aedaa0be2b70bc7679381145adda8"} Dec 09 15:00:26 crc kubenswrapper[4555]: I1209 15:00:26.812716 4555 patch_prober.go:28] interesting pod/downloads-7954f5f757-vbr87 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Dec 09 15:00:26 crc kubenswrapper[4555]: I1209 15:00:26.812744 4555 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-vbr87" podUID="f86821ee-dfd8-4090-bd6c-8f624a1362ae" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Dec 09 15:00:26 crc kubenswrapper[4555]: I1209 15:00:26.846126 4555 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-7drj7"] Dec 09 15:00:26 crc kubenswrapper[4555]: I1209 15:00:26.922224 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qdpr6" Dec 09 15:00:26 crc kubenswrapper[4555]: I1209 15:00:26.922904 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-rf8rn" Dec 09 15:00:26 crc kubenswrapper[4555]: I1209 15:00:26.924935 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wd6tn\" (UID: \"eee41022-2a9c-41c8-bbc8-ff42874c4050\") " pod="openshift-image-registry/image-registry-697d97f7c8-wd6tn" Dec 09 15:00:26 crc kubenswrapper[4555]: I1209 15:00:26.926975 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-r9tjc"] Dec 09 15:00:26 crc kubenswrapper[4555]: E1209 15:00:26.927220 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 15:00:27.427203989 +0000 UTC m=+137.563916707 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wd6tn" (UID: "eee41022-2a9c-41c8-bbc8-ff42874c4050") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:26 crc kubenswrapper[4555]: I1209 15:00:26.944259 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wzp94"] Dec 09 15:00:26 crc kubenswrapper[4555]: I1209 15:00:26.950763 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-srtkn"] Dec 09 15:00:27 crc kubenswrapper[4555]: I1209 15:00:27.008207 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-rf8rn" podStartSLOduration=115.008188231 podStartE2EDuration="1m55.008188231s" podCreationTimestamp="2025-12-09 14:58:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:00:26.998239107 +0000 UTC m=+137.134951825" watchObservedRunningTime="2025-12-09 15:00:27.008188231 +0000 UTC m=+137.144900949" Dec 09 15:00:27 crc kubenswrapper[4555]: I1209 15:00:27.017812 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mk586" podStartSLOduration=115.017792903 podStartE2EDuration="1m55.017792903s" podCreationTimestamp="2025-12-09 14:58:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:00:27.016743766 +0000 UTC m=+137.153456484" watchObservedRunningTime="2025-12-09 
15:00:27.017792903 +0000 UTC m=+137.154505622" Dec 09 15:00:27 crc kubenswrapper[4555]: I1209 15:00:27.028997 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 15:00:27 crc kubenswrapper[4555]: E1209 15:00:27.029550 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 15:00:27.529291714 +0000 UTC m=+137.666004431 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:27 crc kubenswrapper[4555]: I1209 15:00:27.134673 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wd6tn\" (UID: \"eee41022-2a9c-41c8-bbc8-ff42874c4050\") " pod="openshift-image-registry/image-registry-697d97f7c8-wd6tn" Dec 09 15:00:27 crc kubenswrapper[4555]: E1209 15:00:27.135142 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 15:00:27.635131584 +0000 UTC m=+137.771844302 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wd6tn" (UID: "eee41022-2a9c-41c8-bbc8-ff42874c4050") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:27 crc kubenswrapper[4555]: I1209 15:00:27.139651 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7h6sb" podStartSLOduration=115.139635845 podStartE2EDuration="1m55.139635845s" podCreationTimestamp="2025-12-09 14:58:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:00:27.137781302 +0000 UTC m=+137.274494021" watchObservedRunningTime="2025-12-09 15:00:27.139635845 +0000 UTC m=+137.276348563" Dec 09 15:00:27 crc kubenswrapper[4555]: I1209 15:00:27.206154 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-pw6lj"] Dec 09 15:00:27 crc kubenswrapper[4555]: I1209 15:00:27.215439 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rsbss"] Dec 09 15:00:27 crc kubenswrapper[4555]: I1209 15:00:27.236752 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 15:00:27 crc kubenswrapper[4555]: E1209 15:00:27.237226 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 15:00:27.737208938 +0000 UTC m=+137.873921656 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:27 crc kubenswrapper[4555]: I1209 15:00:27.238020 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-vbr87" podStartSLOduration=115.238006097 podStartE2EDuration="1m55.238006097s" podCreationTimestamp="2025-12-09 14:58:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:00:27.237505219 +0000 UTC m=+137.374217938" watchObservedRunningTime="2025-12-09 15:00:27.238006097 +0000 UTC m=+137.374718816" Dec 09 15:00:27 crc kubenswrapper[4555]: I1209 15:00:27.242190 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-hgkns"] Dec 09 15:00:27 crc kubenswrapper[4555]: I1209 15:00:27.293243 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9xbb6"] Dec 09 15:00:27 crc kubenswrapper[4555]: I1209 15:00:27.339073 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wd6tn\" (UID: \"eee41022-2a9c-41c8-bbc8-ff42874c4050\") " pod="openshift-image-registry/image-registry-697d97f7c8-wd6tn" Dec 09 15:00:27 crc kubenswrapper[4555]: E1209 15:00:27.339337 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 15:00:27.839325306 +0000 UTC m=+137.976038023 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wd6tn" (UID: "eee41022-2a9c-41c8-bbc8-ff42874c4050") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:27 crc kubenswrapper[4555]: I1209 15:00:27.353324 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-llvq5" podStartSLOduration=115.353294625 podStartE2EDuration="1m55.353294625s" podCreationTimestamp="2025-12-09 14:58:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:00:27.312404425 +0000 UTC m=+137.449117143" watchObservedRunningTime="2025-12-09 15:00:27.353294625 +0000 UTC m=+137.490007343" Dec 09 15:00:27 crc kubenswrapper[4555]: W1209 15:00:27.371208 4555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode376f745_76d4_42dd_97f7_4554c02eec74.slice/crio-be528cce04188ad429243b853a7d66d67a7924fbf914ccdb8a2b15f4a9fdb797 WatchSource:0}: Error finding container be528cce04188ad429243b853a7d66d67a7924fbf914ccdb8a2b15f4a9fdb797: Status 404 returned error can't find the container with id be528cce04188ad429243b853a7d66d67a7924fbf914ccdb8a2b15f4a9fdb797 Dec 09 15:00:27 crc kubenswrapper[4555]: I1209 15:00:27.403839 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gkckz" podStartSLOduration=115.403825184 podStartE2EDuration="1m55.403825184s" podCreationTimestamp="2025-12-09 14:58:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:00:27.352574951 +0000 UTC m=+137.489287670" watchObservedRunningTime="2025-12-09 15:00:27.403825184 +0000 UTC m=+137.540537902" Dec 09 15:00:27 crc kubenswrapper[4555]: I1209 15:00:27.443589 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 15:00:27 crc kubenswrapper[4555]: E1209 15:00:27.443909 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 15:00:27.943894369 +0000 UTC m=+138.080607077 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:27 crc kubenswrapper[4555]: I1209 15:00:27.479478 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-89w6m"] Dec 09 15:00:27 crc kubenswrapper[4555]: I1209 15:00:27.493066 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-skgqs"] Dec 09 15:00:27 crc kubenswrapper[4555]: I1209 15:00:27.543036 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421540-hhxsf"] Dec 09 15:00:27 crc kubenswrapper[4555]: I1209 15:00:27.545715 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wd6tn\" (UID: \"eee41022-2a9c-41c8-bbc8-ff42874c4050\") " pod="openshift-image-registry/image-registry-697d97f7c8-wd6tn" Dec 09 15:00:27 crc kubenswrapper[4555]: E1209 15:00:27.546060 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 15:00:28.046048639 +0000 UTC m=+138.182761357 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wd6tn" (UID: "eee41022-2a9c-41c8-bbc8-ff42874c4050") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:27 crc kubenswrapper[4555]: I1209 15:00:27.589089 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qdpr6" podStartSLOduration=114.589072019 podStartE2EDuration="1m54.589072019s" podCreationTimestamp="2025-12-09 14:58:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:00:27.563653794 +0000 UTC m=+137.700366512" watchObservedRunningTime="2025-12-09 15:00:27.589072019 +0000 UTC m=+137.725784737" Dec 09 15:00:27 crc kubenswrapper[4555]: W1209 15:00:27.624611 4555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3936549f_6055_4b0d_af18_7958d91c672f.slice/crio-1cda284bb770690536cc9f7a11c1c0e55229741608b23da1143bdbb130173e27 WatchSource:0}: Error finding container 1cda284bb770690536cc9f7a11c1c0e55229741608b23da1143bdbb130173e27: Status 404 returned error can't find the container with id 1cda284bb770690536cc9f7a11c1c0e55229741608b23da1143bdbb130173e27 Dec 09 15:00:27 crc kubenswrapper[4555]: I1209 15:00:27.651180 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 15:00:27 crc kubenswrapper[4555]: E1209 15:00:27.655897 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 15:00:28.155872596 +0000 UTC m=+138.292585314 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:27 crc kubenswrapper[4555]: I1209 15:00:27.698817 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-2wsmx"] Dec 09 15:00:27 crc kubenswrapper[4555]: I1209 15:00:27.706347 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-srlsw"] Dec 09 15:00:27 crc kubenswrapper[4555]: I1209 15:00:27.752625 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wd6tn\" (UID: \"eee41022-2a9c-41c8-bbc8-ff42874c4050\") " pod="openshift-image-registry/image-registry-697d97f7c8-wd6tn" Dec 09 15:00:27 crc kubenswrapper[4555]: E1209 15:00:27.753041 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 15:00:28.25302956 +0000 UTC m=+138.389742279 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wd6tn" (UID: "eee41022-2a9c-41c8-bbc8-ff42874c4050") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:27 crc kubenswrapper[4555]: I1209 15:00:27.790124 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gt6wf"] Dec 09 15:00:27 crc kubenswrapper[4555]: I1209 15:00:27.810368 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-9l7sb"] Dec 09 15:00:27 crc kubenswrapper[4555]: I1209 15:00:27.812414 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-5ncnf" podStartSLOduration=115.812400274 podStartE2EDuration="1m55.812400274s" podCreationTimestamp="2025-12-09 14:58:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:00:27.801058853 +0000 UTC m=+137.937771571" watchObservedRunningTime="2025-12-09 15:00:27.812400274 +0000 UTC m=+137.949112991" Dec 09 15:00:27 crc kubenswrapper[4555]: I1209 15:00:27.813566 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-wt67p"] Dec 09 15:00:27 crc kubenswrapper[4555]: I1209 15:00:27.832740 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7trl6"] Dec 09 15:00:27 crc kubenswrapper[4555]: I1209 15:00:27.841834 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="hostpath-provisioner/csi-hostpathplugin-2wsmx" event={"ID":"0445dab5-5a42-4098-9fe0-e38b5650ecd6","Type":"ContainerStarted","Data":"df4f84be51e77ff21efe4b9e2da50feb6eea589683699c99e7c1ef21642b3316"} Dec 09 15:00:27 crc kubenswrapper[4555]: I1209 15:00:27.846540 4555 generic.go:334] "Generic (PLEG): container finished" podID="b7402585-182c-46a1-a8ac-257c97421c8d" containerID="641b2b75db7d5eb3794cb7ce46d822c3653701ab065cd92b7508bd4ac93860e0" exitCode=0 Dec 09 15:00:27 crc kubenswrapper[4555]: I1209 15:00:27.846599 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-xdwlv" event={"ID":"b7402585-182c-46a1-a8ac-257c97421c8d","Type":"ContainerDied","Data":"641b2b75db7d5eb3794cb7ce46d822c3653701ab065cd92b7508bd4ac93860e0"} Dec 09 15:00:27 crc kubenswrapper[4555]: I1209 15:00:27.847098 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-4tvxr"] Dec 09 15:00:27 crc kubenswrapper[4555]: I1209 15:00:27.853388 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 15:00:27 crc kubenswrapper[4555]: E1209 15:00:27.853531 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 15:00:28.35351561 +0000 UTC m=+138.490228329 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:27 crc kubenswrapper[4555]: I1209 15:00:27.853647 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wd6tn\" (UID: \"eee41022-2a9c-41c8-bbc8-ff42874c4050\") " pod="openshift-image-registry/image-registry-697d97f7c8-wd6tn" Dec 09 15:00:27 crc kubenswrapper[4555]: E1209 15:00:27.853925 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 15:00:28.353915197 +0000 UTC m=+138.490627915 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wd6tn" (UID: "eee41022-2a9c-41c8-bbc8-ff42874c4050") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:27 crc kubenswrapper[4555]: I1209 15:00:27.857654 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421540-hhxsf" event={"ID":"3936549f-6055-4b0d-af18-7958d91c672f","Type":"ContainerStarted","Data":"1cda284bb770690536cc9f7a11c1c0e55229741608b23da1143bdbb130173e27"} Dec 09 15:00:27 crc kubenswrapper[4555]: I1209 15:00:27.868060 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-7drj7" event={"ID":"5926dc72-d5e9-4c49-afba-ac1ce69678be","Type":"ContainerStarted","Data":"22195d13ea424584e4b1264269cb64d02d3eb2139bef1c0d8799da4e94bb50f4"} Dec 09 15:00:27 crc kubenswrapper[4555]: I1209 15:00:27.879033 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-skgqs" event={"ID":"147065ae-e48c-4712-8d72-016b2a692da8","Type":"ContainerStarted","Data":"12640806e00feacbd8b715f7e8ce990544eee577b542ddd32edf7472be5eaf1a"} Dec 09 15:00:27 crc kubenswrapper[4555]: I1209 15:00:27.881230 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-r9tjc" event={"ID":"d73470df-6b49-4a9f-bcd8-a0acac5cb921","Type":"ContainerStarted","Data":"e05c8700038e9f9b7b58796edb4ea6ceea9817c63e49db115a4979babb7988d0"} Dec 09 15:00:27 crc kubenswrapper[4555]: I1209 15:00:27.882254 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wzp94" event={"ID":"b5a1d5a2-358f-4445-9253-5a9bc8b9daf3","Type":"ContainerStarted","Data":"b30fba6d50565aef8e437fd40c2cb7eef7a514afeabf7aa22649512478525250"} Dec 09 15:00:27 crc kubenswrapper[4555]: I1209 15:00:27.882281 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wzp94" event={"ID":"b5a1d5a2-358f-4445-9253-5a9bc8b9daf3","Type":"ContainerStarted","Data":"27f3f58831e4d77c0100c0152d568b303a3a645f2f1691d57ccbea0919673c89"} Dec 09 15:00:27 crc kubenswrapper[4555]: I1209 15:00:27.894616 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rsbss" event={"ID":"dc25bf0a-f5e4-4f6d-9442-6b55a043f751","Type":"ContainerStarted","Data":"42fa9342d354691e00279c3fddf4c1fb86a6d2c7b197d88a52aea09357c26251"} Dec 09 15:00:27 crc kubenswrapper[4555]: I1209 15:00:27.895746 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-srtkn" event={"ID":"5446f927-f240-4554-832a-28b950424eea","Type":"ContainerStarted","Data":"fe562077b845926d39cd1378879787eaf475120081559a3cf648e4c4afe28fd3"} Dec 09 15:00:27 crc kubenswrapper[4555]: I1209 15:00:27.898083 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-whvjt" 
event={"ID":"bc98cfae-5f65-416a-ba4b-24f601ff6209","Type":"ContainerStarted","Data":"2354589223ac712db3c160f9a6f35212e3fdf57559f922dd57e926d9c3acc3f2"} Dec 09 15:00:27 crc kubenswrapper[4555]: I1209 15:00:27.902634 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2f6gc" event={"ID":"d3f2ca3d-d5ad-4a22-96d4-3bc4c5eb7a96","Type":"ContainerStarted","Data":"45a7e710bdb28ca70bb13474faaa5d1e007cf4f6525d5bcfeb9349dedd4a2a88"} Dec 09 15:00:27 crc kubenswrapper[4555]: I1209 15:00:27.902663 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2f6gc" event={"ID":"d3f2ca3d-d5ad-4a22-96d4-3bc4c5eb7a96","Type":"ContainerStarted","Data":"f81505f2e75e24634f6949f7ba342926246fb2c637f318ddb85e79c24bdb2a17"} Dec 09 15:00:27 crc kubenswrapper[4555]: I1209 15:00:27.904887 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9xbb6" event={"ID":"adaadc75-0b7d-4699-8d61-fe5ebb7dc84a","Type":"ContainerStarted","Data":"347a61ead7407bf69b6fa8617be5b0a27ab2b6c4567c14754ea09cfd56e1e6f5"} Dec 09 15:00:27 crc kubenswrapper[4555]: I1209 15:00:27.910180 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-vhncm" event={"ID":"a7fc2be6-0cb9-4fad-ac58-37004264a563","Type":"ContainerStarted","Data":"e0e79c2f0a37b71ee10a288e3c2f7510f1b093c170102011ea999c6d4c0198b1"} Dec 09 15:00:27 crc kubenswrapper[4555]: I1209 15:00:27.914937 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-pw6lj" event={"ID":"e376f745-76d4-42dd-97f7-4554c02eec74","Type":"ContainerStarted","Data":"be528cce04188ad429243b853a7d66d67a7924fbf914ccdb8a2b15f4a9fdb797"} Dec 09 15:00:27 crc kubenswrapper[4555]: I1209 15:00:27.917950 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-pldb7" event={"ID":"33ad6890-1e95-4a46-9f40-00cf0443d4a1","Type":"ContainerStarted","Data":"59e3d3bd743955b9a09cab1f96aeb936bc4a8ad712de0f09f94870f65fd7dbe3"} Dec 09 15:00:27 crc kubenswrapper[4555]: I1209 15:00:27.939414 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-89bvk"] Dec 09 15:00:27 crc kubenswrapper[4555]: I1209 15:00:27.939803 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8fb58" event={"ID":"7ce81b97-c663-47e5-b75e-ba3333d5cb9a","Type":"ContainerStarted","Data":"6fdfd64e228afbcc04750ffd3c277d9c936de9c0e1205de499381b4670a64a67"} Dec 09 15:00:27 crc kubenswrapper[4555]: I1209 15:00:27.940906 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-msfrj"] Dec 09 15:00:27 crc kubenswrapper[4555]: I1209 15:00:27.954172 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 15:00:27 crc kubenswrapper[4555]: E1209 15:00:27.955133 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 15:00:28.455115039 +0000 UTC m=+138.591827758 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:27 crc kubenswrapper[4555]: I1209 15:00:27.959934 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2f6gc" podStartSLOduration=115.959916774 podStartE2EDuration="1m55.959916774s" podCreationTimestamp="2025-12-09 14:58:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:00:27.95377491 +0000 UTC m=+138.090487629" watchObservedRunningTime="2025-12-09 15:00:27.959916774 +0000 UTC m=+138.096629481" Dec 09 15:00:27 crc kubenswrapper[4555]: I1209 15:00:27.963881 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-5kplw" event={"ID":"2d15dddf-4a72-4590-9def-755a9fd11da6","Type":"ContainerStarted","Data":"bbe608509dfe5d58ba859241fb6b3793674c244adf69e7fd4f13ddf0d39c28b0"} Dec 09 15:00:27 crc kubenswrapper[4555]: I1209 15:00:27.985620 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-b27rg" event={"ID":"6f6a1efb-6c81-45a4-91e2-9f67f8a266be","Type":"ContainerStarted","Data":"5f1e8b93223a8cd0342bbdc7f11561ce35ba00d85438f69419e3be3cf62d07e5"} Dec 09 15:00:28 crc kubenswrapper[4555]: I1209 15:00:28.042035 4555 generic.go:334] "Generic (PLEG): container finished" podID="13cf2230-0cbf-49b1-92f4-b49b8d72acec" containerID="7b2c3a30b3a1a41f656be135bb730d4c34bad109d861c115a2870056b29fdacc" exitCode=0 Dec 09 15:00:28 crc kubenswrapper[4555]: I1209 15:00:28.042105 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5xggk" event={"ID":"13cf2230-0cbf-49b1-92f4-b49b8d72acec","Type":"ContainerDied","Data":"7b2c3a30b3a1a41f656be135bb730d4c34bad109d861c115a2870056b29fdacc"} Dec 09 15:00:28 crc kubenswrapper[4555]: I1209 15:00:28.044819 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-whvjt" podStartSLOduration=6.044803166 podStartE2EDuration="6.044803166s" podCreationTimestamp="2025-12-09 15:00:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:00:28.017120342 +0000 UTC m=+138.153833060" watchObservedRunningTime="2025-12-09 15:00:28.044803166 +0000 UTC m=+138.181515885" Dec 09 15:00:28 crc kubenswrapper[4555]: I1209 15:00:28.080234 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wd6tn\" (UID: \"eee41022-2a9c-41c8-bbc8-ff42874c4050\") " pod="openshift-image-registry/image-registry-697d97f7c8-wd6tn" Dec 09 
15:00:28 crc kubenswrapper[4555]: E1209 15:00:28.083440 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 15:00:28.58342003 +0000 UTC m=+138.720132749 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wd6tn" (UID: "eee41022-2a9c-41c8-bbc8-ff42874c4050") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:28 crc kubenswrapper[4555]: I1209 15:00:28.094247 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-89w6m" event={"ID":"c1b1ecba-4701-4e11-b2e9-1327cc2c5c2d","Type":"ContainerStarted","Data":"520ec668340dc6c1ea4572f74b4d1f949236f94cdeb5fabef1b65177c6a8be38"} Dec 09 15:00:28 crc kubenswrapper[4555]: I1209 15:00:28.116073 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hgkns" event={"ID":"895674c7-2ae3-4c93-8fcf-a2e1b7f2dfcb","Type":"ContainerStarted","Data":"3ae8e159f40bf02478a7be7c19f2fd3626bdb4b43485b7baa07634b6d712f64a"} Dec 09 15:00:28 crc kubenswrapper[4555]: I1209 15:00:28.144319 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wzp94" podStartSLOduration=116.144289032 podStartE2EDuration="1m56.144289032s" podCreationTimestamp="2025-12-09 14:58:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:00:28.07987213 +0000 UTC m=+138.216584848" watchObservedRunningTime="2025-12-09 15:00:28.144289032 +0000 UTC m=+138.281001750" Dec 09 15:00:28 crc kubenswrapper[4555]: I1209 15:00:28.144886 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-vhncm" podStartSLOduration=116.144880663 podStartE2EDuration="1m56.144880663s" podCreationTimestamp="2025-12-09 14:58:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:00:28.14343245 +0000 UTC m=+138.280145168" watchObservedRunningTime="2025-12-09 15:00:28.144880663 +0000 UTC m=+138.281593380" Dec 09 15:00:28 crc kubenswrapper[4555]: I1209 15:00:28.145580 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-srlsw" event={"ID":"8c1c0513-5d12-463c-b66d-c1a5d02eafc0","Type":"ContainerStarted","Data":"ef60cc6953dc083a896c67ecc9c9e87895dedaf752c579a448ca1aa93cabd0ac"} Dec 09 15:00:28 crc kubenswrapper[4555]: I1209 15:00:28.153981 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-nqlbc" event={"ID":"e40fc0c2-28f1-444d-8ff4-8f11faa1db2b","Type":"ContainerStarted","Data":"6267f49d058946fadfd7c805277699c0edcb6b1b8f1b2a16ecea8dcf95d4d978"} Dec 09 15:00:28 crc kubenswrapper[4555]: I1209 15:00:28.154182 4555 patch_prober.go:28] interesting pod/downloads-7954f5f757-vbr87 container/download-server namespace/openshift-console: Readiness probe 
status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Dec 09 15:00:28 crc kubenswrapper[4555]: I1209 15:00:28.154214 4555 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-vbr87" podUID="f86821ee-dfd8-4090-bd6c-8f624a1362ae" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Dec 09 15:00:28 crc kubenswrapper[4555]: I1209 15:00:28.183032 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 15:00:28 crc kubenswrapper[4555]: E1209 15:00:28.184335 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 15:00:28.684315647 +0000 UTC m=+138.821028365 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:28 crc kubenswrapper[4555]: I1209 15:00:28.232661 4555 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-pldb7" Dec 09 15:00:28 crc kubenswrapper[4555]: I1209 15:00:28.247566 4555 patch_prober.go:28] interesting pod/router-default-5444994796-pldb7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 09 15:00:28 crc kubenswrapper[4555]: [-]has-synced failed: reason withheld Dec 09 15:00:28 crc kubenswrapper[4555]: [+]process-running ok Dec 09 15:00:28 crc kubenswrapper[4555]: healthz check failed Dec 09 15:00:28 crc kubenswrapper[4555]: I1209 15:00:28.247639 4555 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pldb7" podUID="33ad6890-1e95-4a46-9f40-00cf0443d4a1" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 09 15:00:28 crc kubenswrapper[4555]: I1209 15:00:28.274823 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8fb58" podStartSLOduration=116.274803258 podStartE2EDuration="1m56.274803258s" podCreationTimestamp="2025-12-09 14:58:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:00:28.214791831 +0000 UTC m=+138.351504539" watchObservedRunningTime="2025-12-09 15:00:28.274803258 +0000 UTC m=+138.411515977" Dec 09 15:00:28 crc kubenswrapper[4555]: I1209 15:00:28.284539 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wd6tn\" (UID: \"eee41022-2a9c-41c8-bbc8-ff42874c4050\") " pod="openshift-image-registry/image-registry-697d97f7c8-wd6tn" Dec 09 15:00:28 crc kubenswrapper[4555]: E1209 15:00:28.287352 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 15:00:28.787289268 +0000 UTC m=+138.924001986 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wd6tn" (UID: "eee41022-2a9c-41c8-bbc8-ff42874c4050") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:28 crc kubenswrapper[4555]: I1209 15:00:28.297988 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-b27rg" podStartSLOduration=115.297970038 podStartE2EDuration="1m55.297970038s" podCreationTimestamp="2025-12-09 14:58:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:00:28.297149875 +0000 UTC m=+138.433862593" watchObservedRunningTime="2025-12-09 15:00:28.297970038 +0000 UTC m=+138.434682756" Dec 09 15:00:28 crc kubenswrapper[4555]: I1209 15:00:28.352903 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-pldb7" podStartSLOduration=116.352886585 podStartE2EDuration="1m56.352886585s" podCreationTimestamp="2025-12-09 14:58:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:00:28.352565558 +0000 UTC m=+138.489278275" watchObservedRunningTime="2025-12-09 15:00:28.352886585 +0000 UTC m=+138.489599303" Dec 09 15:00:28 crc kubenswrapper[4555]: I1209 15:00:28.391295 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-5kplw" podStartSLOduration=116.39127665 podStartE2EDuration="1m56.39127665s" podCreationTimestamp="2025-12-09 14:58:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:00:28.390691652 +0000 UTC m=+138.527404370" watchObservedRunningTime="2025-12-09 15:00:28.39127665 +0000 UTC m=+138.527989368" Dec 09 15:00:28 crc kubenswrapper[4555]: I1209 15:00:28.391347 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 15:00:28 crc kubenswrapper[4555]: E1209 15:00:28.392374 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-09 15:00:28.892354501 +0000 UTC m=+139.029067219 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:28 crc kubenswrapper[4555]: I1209 15:00:28.434746 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-nqlbc" podStartSLOduration=116.434723542 podStartE2EDuration="1m56.434723542s" podCreationTimestamp="2025-12-09 14:58:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:00:28.432548703 +0000 UTC m=+138.569261421" watchObservedRunningTime="2025-12-09 15:00:28.434723542 +0000 UTC m=+138.571436260" Dec 09 15:00:28 crc kubenswrapper[4555]: I1209 15:00:28.493922 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wd6tn\" (UID: \"eee41022-2a9c-41c8-bbc8-ff42874c4050\") " pod="openshift-image-registry/image-registry-697d97f7c8-wd6tn" Dec 09 15:00:28 crc kubenswrapper[4555]: E1209 15:00:28.494487 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 15:00:28.99447689 +0000 UTC m=+139.131189608 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wd6tn" (UID: "eee41022-2a9c-41c8-bbc8-ff42874c4050") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:28 crc kubenswrapper[4555]: I1209 15:00:28.596110 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 15:00:28 crc kubenswrapper[4555]: E1209 15:00:28.596555 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 15:00:29.096540429 +0000 UTC m=+139.233253136 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:28 crc kubenswrapper[4555]: I1209 15:00:28.697537 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wd6tn\" (UID: \"eee41022-2a9c-41c8-bbc8-ff42874c4050\") " pod="openshift-image-registry/image-registry-697d97f7c8-wd6tn" Dec 09 15:00:28 crc kubenswrapper[4555]: E1209 15:00:28.698076 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 15:00:29.198065086 +0000 UTC m=+139.334777803 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wd6tn" (UID: "eee41022-2a9c-41c8-bbc8-ff42874c4050") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:28 crc kubenswrapper[4555]: I1209 15:00:28.798246 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 15:00:28 crc kubenswrapper[4555]: E1209 15:00:28.798773 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 15:00:29.298752357 +0000 UTC m=+139.435465075 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:28 crc kubenswrapper[4555]: I1209 15:00:28.901689 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wd6tn\" (UID: \"eee41022-2a9c-41c8-bbc8-ff42874c4050\") " pod="openshift-image-registry/image-registry-697d97f7c8-wd6tn" Dec 09 15:00:28 crc kubenswrapper[4555]: E1209 15:00:28.902030 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 15:00:29.402013482 +0000 UTC m=+139.538726201 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wd6tn" (UID: "eee41022-2a9c-41c8-bbc8-ff42874c4050") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.002502 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 15:00:29 crc kubenswrapper[4555]: E1209 15:00:29.003048 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 15:00:29.503035128 +0000 UTC m=+139.639747845 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.103856 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wd6tn\" (UID: \"eee41022-2a9c-41c8-bbc8-ff42874c4050\") " pod="openshift-image-registry/image-registry-697d97f7c8-wd6tn" Dec 09 15:00:29 crc kubenswrapper[4555]: E1209 15:00:29.104210 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 15:00:29.604198529 +0000 UTC m=+139.740911248 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wd6tn" (UID: "eee41022-2a9c-41c8-bbc8-ff42874c4050") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.182396 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9l7sb" event={"ID":"2267e449-66c1-4699-897b-9851e54d2645","Type":"ContainerStarted","Data":"6ba1f85dc322169097b38e8633251f32385d8248140a9912bdda67719cea5e6d"} Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.182614 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9l7sb" event={"ID":"2267e449-66c1-4699-897b-9851e54d2645","Type":"ContainerStarted","Data":"5a1a6ca21ad9bbbb80ee2010af806f0ca5d0fb2085659ad7a82a5487bef7b135"} Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.189774 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-7drj7" event={"ID":"5926dc72-d5e9-4c49-afba-ac1ce69678be","Type":"ContainerStarted","Data":"a130d532e5d2266809ef8bbc43036fb889f8f78654ffe8b0d8325fdcf1b0b651"} Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.190349 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-7drj7" Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.203355 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jxl9b" event={"ID":"0a4f7efd-8434-48a7-979e-9f128e93797b","Type":"ContainerStarted","Data":"87ed3b03d2101181b59bc929394d48c1091a588443c1b2a1f795505ad0e78ad1"} Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.203391 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jxl9b" 
event={"ID":"0a4f7efd-8434-48a7-979e-9f128e93797b","Type":"ContainerStarted","Data":"1101d705ece3118e79f992ce7466f26ba014a0b8b755944729f9392d85a893a0"} Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.206795 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 15:00:29 crc kubenswrapper[4555]: E1209 15:00:29.207243 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 15:00:29.707229801 +0000 UTC m=+139.843942519 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.207439 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wd6tn\" (UID: \"eee41022-2a9c-41c8-bbc8-ff42874c4050\") " pod="openshift-image-registry/image-registry-697d97f7c8-wd6tn" Dec 09 15:00:29 crc kubenswrapper[4555]: E1209 15:00:29.208666 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 15:00:29.708652856 +0000 UTC m=+139.845365574 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wd6tn" (UID: "eee41022-2a9c-41c8-bbc8-ff42874c4050") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.219087 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-7drj7" podStartSLOduration=117.219073313 podStartE2EDuration="1m57.219073313s" podCreationTimestamp="2025-12-09 14:58:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:00:29.218153211 +0000 UTC m=+139.354865929" watchObservedRunningTime="2025-12-09 15:00:29.219073313 +0000 UTC m=+139.355786032" Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.228021 4555 patch_prober.go:28] interesting pod/router-default-5444994796-pldb7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 09 15:00:29 crc kubenswrapper[4555]: [-]has-synced failed: reason withheld Dec 09 15:00:29 crc kubenswrapper[4555]: [+]process-running ok Dec 09 15:00:29 crc kubenswrapper[4555]: healthz check failed Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.228049 4555 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pldb7" podUID="33ad6890-1e95-4a46-9f40-00cf0443d4a1" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.229994 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hgkns" event={"ID":"895674c7-2ae3-4c93-8fcf-a2e1b7f2dfcb","Type":"ContainerStarted","Data":"cbe8d303590b11d2b391752bdda923700de7ec0912d512cc3819fee658cb5e49"} Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.230029 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hgkns" event={"ID":"895674c7-2ae3-4c93-8fcf-a2e1b7f2dfcb","Type":"ContainerStarted","Data":"15ca97ed8c8b5c6f9a92c20f10c248d6bd666f41b94855dd988334eb8ec8076e"} Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.259751 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-wt67p" event={"ID":"468e155a-66ab-4823-b737-7e1f9c78ac3f","Type":"ContainerStarted","Data":"abce3a6add912a65fac86e53758f5ed7129a7cf8fbbe499f914c00c42cb67d35"} Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.259789 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-wt67p" event={"ID":"468e155a-66ab-4823-b737-7e1f9c78ac3f","Type":"ContainerStarted","Data":"6ad593483a575c27403a3ae7db1b48d47da9f202e8e5748f01e9a9c5f7d6ef4c"} Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.259799 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-wt67p" 
event={"ID":"468e155a-66ab-4823-b737-7e1f9c78ac3f","Type":"ContainerStarted","Data":"ded494e4a2e4388a1941c54821668b967454235eca1a957e1328c28c24fdf7a0"} Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.260388 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-wt67p" Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.270294 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hgkns" podStartSLOduration=117.270284062 podStartE2EDuration="1m57.270284062s" podCreationTimestamp="2025-12-09 14:58:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:00:29.269961741 +0000 UTC m=+139.406674460" watchObservedRunningTime="2025-12-09 15:00:29.270284062 +0000 UTC m=+139.406996780" Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.271136 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jxl9b" podStartSLOduration=117.271128822 podStartE2EDuration="1m57.271128822s" podCreationTimestamp="2025-12-09 14:58:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:00:29.248532232 +0000 UTC m=+139.385244950" watchObservedRunningTime="2025-12-09 15:00:29.271128822 +0000 UTC m=+139.407841540" Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.280534 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-v5j6b" event={"ID":"89fdb2fe-9563-4934-82c8-10c218d81bea","Type":"ContainerStarted","Data":"91d9afced0c772ad319a8a18e08508bc11e29dff86ec3c7b921e59b3582aa84d"} Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.280988 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-v5j6b" Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.308058 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7trl6" event={"ID":"d98a1c63-4a49-437d-97a3-c9c37efce476","Type":"ContainerStarted","Data":"4cc2cb13575e85c441522844e64e450f6a6bff03a0c9865c2d75e8b464824dc2"} Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.308098 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7trl6" event={"ID":"d98a1c63-4a49-437d-97a3-c9c37efce476","Type":"ContainerStarted","Data":"67d21b1f36e60deb09a19821db8deaba47334a0a3fd9b8eaa0a828bac5eb403d"} Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.308272 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.308698 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7trl6" Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.308799 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-wt67p" podStartSLOduration=117.308785076 podStartE2EDuration="1m57.308785076s" podCreationTimestamp="2025-12-09 14:58:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:00:29.307809048 +0000 UTC m=+139.444521756" watchObservedRunningTime="2025-12-09 15:00:29.308785076 +0000 UTC m=+139.445497794" Dec 09 15:00:29 crc kubenswrapper[4555]: E1209 15:00:29.309523 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 15:00:29.80949423 +0000 UTC m=+139.946206948 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.315078 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-srtkn" event={"ID":"5446f927-f240-4554-832a-28b950424eea","Type":"ContainerStarted","Data":"ce33c2baef561e310f8a342b60e6fdd4fcaa220a198022caa2cb89d2549bc9a3"} Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.315123 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-srtkn" event={"ID":"5446f927-f240-4554-832a-28b950424eea","Type":"ContainerStarted","Data":"ab3069bcee3f99adc0fa94bfef2585169fdc75877162a4802bfea7428db8a348"} Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.317535 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gt6wf" event={"ID":"a77db5eb-461b-49d7-a2d7-91be5bb52a09","Type":"ContainerStarted","Data":"6b815ff955462ade102199689187bde9db440509bf890fe2aa5db2f1539bad8e"} Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.317561 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gt6wf" event={"ID":"a77db5eb-461b-49d7-a2d7-91be5bb52a09","Type":"ContainerStarted","Data":"2542f6bca89ea5da4166d83cad5036a6a4f826db5b0e65828302fd761ab42b61"} Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.318068 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gt6wf" Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.318831 4555 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-gt6wf container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" start-of-body= Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.318872 4555 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gt6wf" podUID="a77db5eb-461b-49d7-a2d7-91be5bb52a09" containerName="catalog-operator" probeResult="failure" 
output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.320814 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-89w6m" event={"ID":"c1b1ecba-4701-4e11-b2e9-1327cc2c5c2d","Type":"ContainerStarted","Data":"d438bc5bc947a0c05af37bdbb31489dfe202c851569711d2c03333393c0e8d4b"} Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.322604 4555 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-7trl6 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.29:5443/healthz\": dial tcp 10.217.0.29:5443: connect: connection refused" start-of-body= Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.322632 4555 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7trl6" podUID="d98a1c63-4a49-437d-97a3-c9c37efce476" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.29:5443/healthz\": dial tcp 10.217.0.29:5443: connect: connection refused" Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.324218 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-89bvk" event={"ID":"dd026b3b-b5df-4f5a-81c8-2bf0b6b8c766","Type":"ContainerStarted","Data":"fd082f1be9e2c812f14e63a0c1291cbc20a996234fe0189d387f6fcf89b08672"} Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.324248 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-89bvk" event={"ID":"dd026b3b-b5df-4f5a-81c8-2bf0b6b8c766","Type":"ContainerStarted","Data":"d1d1e88bff0b8f49fcb4a72fc5fc4a47f6a606b486b065519d5bd52315875355"} Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.324259 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-89bvk" event={"ID":"dd026b3b-b5df-4f5a-81c8-2bf0b6b8c766","Type":"ContainerStarted","Data":"8dc4eb79f8b2c93f8b10e0e0c7343bc36a1008e2f814bf746c27ed528e112867"} Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.327176 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9xbb6" event={"ID":"adaadc75-0b7d-4699-8d61-fe5ebb7dc84a","Type":"ContainerStarted","Data":"b2214651a71db6dfb745b5598eaee2da536ec59deec1aded27d6d056ee224bda"} Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.330718 4555 generic.go:334] "Generic (PLEG): container finished" podID="77dac49a-34a8-421c-99ab-1fa4a08fc6fa" containerID="b3cead1747e9d2b8da84392764e106b940f10cd4254f3b834917591081f62886" exitCode=0 Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.330956 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mtdft" event={"ID":"77dac49a-34a8-421c-99ab-1fa4a08fc6fa","Type":"ContainerDied","Data":"b3cead1747e9d2b8da84392764e106b940f10cd4254f3b834917591081f62886"} Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.344590 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-v5j6b" podStartSLOduration=117.344573302 podStartE2EDuration="1m57.344573302s" podCreationTimestamp="2025-12-09 14:58:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:00:29.336793526 +0000 UTC m=+139.473506245" watchObservedRunningTime="2025-12-09 15:00:29.344573302 +0000 UTC m=+139.481286020" Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.365594 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-skgqs" event={"ID":"147065ae-e48c-4712-8d72-016b2a692da8","Type":"ContainerStarted","Data":"e33ef89af12e37949b8daed8e2fe690e82a3784edcc7cc7e7820186089f6291a"} Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.366107 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-89w6m" podStartSLOduration=7.366092322 podStartE2EDuration="7.366092322s" podCreationTimestamp="2025-12-09 15:00:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:00:29.358717152 +0000 UTC m=+139.495429870" watchObservedRunningTime="2025-12-09 15:00:29.366092322 +0000 UTC m=+139.502805040" Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.366907 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-skgqs" Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.383583 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-skgqs" Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.412925 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wd6tn\" (UID: \"eee41022-2a9c-41c8-bbc8-ff42874c4050\") " pod="openshift-image-registry/image-registry-697d97f7c8-wd6tn" Dec 09 15:00:29 crc kubenswrapper[4555]: E1209 15:00:29.414754 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 15:00:29.914740645 +0000 UTC m=+140.051453364 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wd6tn" (UID: "eee41022-2a9c-41c8-bbc8-ff42874c4050") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.423730 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-2wsmx" event={"ID":"0445dab5-5a42-4098-9fe0-e38b5650ecd6","Type":"ContainerStarted","Data":"3d3627937e58cf6c2f0cbc59d7fa0a53a1093dacdd830f616da8a993390a4487"} Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.425768 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7trl6" podStartSLOduration=117.425743177 podStartE2EDuration="1m57.425743177s" podCreationTimestamp="2025-12-09 14:58:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:00:29.423778294 +0000 UTC m=+139.560491012" watchObservedRunningTime="2025-12-09 15:00:29.425743177 +0000 UTC m=+139.562455894" Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.426659 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-srtkn" podStartSLOduration=117.426645535 podStartE2EDuration="1m57.426645535s" podCreationTimestamp="2025-12-09 14:58:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:00:29.399845313 +0000 UTC m=+139.536558031" watchObservedRunningTime="2025-12-09 15:00:29.426645535 +0000 UTC m=+139.563358254" Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.459438 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-msfrj" event={"ID":"1908bc6b-523d-4ffd-8e4c-e73f3e3b7ee3","Type":"ContainerStarted","Data":"d059f155b5d51d290f9b3c2bec03cc3879174f7327917941b9a9c6c9b700ef84"} Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.459492 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-msfrj" event={"ID":"1908bc6b-523d-4ffd-8e4c-e73f3e3b7ee3","Type":"ContainerStarted","Data":"0f9e59c3f74383a6a4b825dee95e54f4f546f52ecde543552dd28c7ef5e823b9"} Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.460350 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9xbb6" podStartSLOduration=117.460318384 podStartE2EDuration="1m57.460318384s" podCreationTimestamp="2025-12-09 14:58:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:00:29.459761821 +0000 UTC m=+139.596474528" watchObservedRunningTime="2025-12-09 15:00:29.460318384 +0000 UTC m=+139.597031102" Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.479016 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gt6wf" podStartSLOduration=117.478989249 podStartE2EDuration="1m57.478989249s" 
podCreationTimestamp="2025-12-09 14:58:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:00:29.477493816 +0000 UTC m=+139.614206534" watchObservedRunningTime="2025-12-09 15:00:29.478989249 +0000 UTC m=+139.615701967" Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.481727 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-xdwlv" event={"ID":"b7402585-182c-46a1-a8ac-257c97421c8d","Type":"ContainerStarted","Data":"720a181c888958a7a8eafa44c7a7b2b35ec4bdc3a8deb1baabe001d4758b6f55"} Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.495056 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-pw6lj" event={"ID":"e376f745-76d4-42dd-97f7-4554c02eec74","Type":"ContainerStarted","Data":"c7b8876021e430797c3f990bf286d96c6b0e1094f79da1cb3f7dec834f0ce8bf"} Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.515549 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.517057 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-4tvxr" event={"ID":"c03f49e6-2884-451b-af8c-9d47555d1413","Type":"ContainerStarted","Data":"c50f2068a9c2f245e8e756b19fd0713c463f7e059ea58f3e114c70b3519126b1"} Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.517094 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-4tvxr" event={"ID":"c03f49e6-2884-451b-af8c-9d47555d1413","Type":"ContainerStarted","Data":"f12c519cb27e4dc4860eabfc76aea9d0822ecce6e8690bd43da5b20af76f43ab"} Dec 09 15:00:29 crc kubenswrapper[4555]: E1209 15:00:29.518339 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 15:00:30.018293085 +0000 UTC m=+140.155005803 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.526139 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421540-hhxsf" event={"ID":"3936549f-6055-4b0d-af18-7958d91c672f","Type":"ContainerStarted","Data":"5be729e8f1858bddb4d242a3062488ed3f6e1b540192989da0f33e818cea5809"} Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.537686 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-89bvk" podStartSLOduration=117.537673102 podStartE2EDuration="1m57.537673102s" podCreationTimestamp="2025-12-09 14:58:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:00:29.535638529 +0000 UTC m=+139.672351246" watchObservedRunningTime="2025-12-09 15:00:29.537673102 +0000 UTC m=+139.674385819" Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.540752 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-srlsw" event={"ID":"8c1c0513-5d12-463c-b66d-c1a5d02eafc0","Type":"ContainerStarted","Data":"703a32908d95de30c0a23d8b5ccf34acd82aa5d32fd58b62abf083a63d3aec9f"} Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.541511 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-srlsw" Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.548862 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rsbss" event={"ID":"dc25bf0a-f5e4-4f6d-9442-6b55a043f751","Type":"ContainerStarted","Data":"0bf880ccf452f01159543c1cf96343ce7096881e9b3ee6c2d5c4d2adfc738506"} Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.554445 4555 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-srlsw container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/healthz\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.554505 4555 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-srlsw" podUID="8c1c0513-5d12-463c-b66d-c1a5d02eafc0" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.27:8080/healthz\": dial tcp 10.217.0.27:8080: connect: connection refused" Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.559194 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5xggk" event={"ID":"13cf2230-0cbf-49b1-92f4-b49b8d72acec","Type":"ContainerStarted","Data":"686032c10fd9a98b0ab87070314d879ced00ea3db7e04652a9f7521fc971c8e0"} Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.578336 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-4tvxr" 
podStartSLOduration=116.578324399 podStartE2EDuration="1m56.578324399s" podCreationTimestamp="2025-12-09 14:58:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:00:29.553479149 +0000 UTC m=+139.690191868" watchObservedRunningTime="2025-12-09 15:00:29.578324399 +0000 UTC m=+139.715037117" Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.584871 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-r9tjc" event={"ID":"d73470df-6b49-4a9f-bcd8-a0acac5cb921","Type":"ContainerStarted","Data":"31e2fbcf9a182331a73454d5cb0dd1bb16824902b687aa9d217c22d743036772"} Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.584902 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-r9tjc" event={"ID":"d73470df-6b49-4a9f-bcd8-a0acac5cb921","Type":"ContainerStarted","Data":"4d85e8a09b6f8ddd812b7ef0842c6a4fbb0d426dfe99b5512d46be5a57b9e77e"} Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.600261 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-pw6lj" podStartSLOduration=117.60025134 podStartE2EDuration="1m57.60025134s" podCreationTimestamp="2025-12-09 14:58:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:00:29.579764566 +0000 UTC m=+139.716477285" watchObservedRunningTime="2025-12-09 15:00:29.60025134 +0000 UTC m=+139.736964059" Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.605733 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29421540-hhxsf" podStartSLOduration=29.60572625 podStartE2EDuration="29.60572625s" podCreationTimestamp="2025-12-09 15:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:00:29.597269763 +0000 UTC m=+139.733982481" watchObservedRunningTime="2025-12-09 15:00:29.60572625 +0000 UTC m=+139.742438969" Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.616921 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wd6tn\" (UID: \"eee41022-2a9c-41c8-bbc8-ff42874c4050\") " pod="openshift-image-registry/image-registry-697d97f7c8-wd6tn" Dec 09 15:00:29 crc kubenswrapper[4555]: E1209 15:00:29.618160 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 15:00:30.118130765 +0000 UTC m=+140.254843484 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wd6tn" (UID: "eee41022-2a9c-41c8-bbc8-ff42874c4050") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.639406 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-xdwlv" podStartSLOduration=117.639382558 podStartE2EDuration="1m57.639382558s" podCreationTimestamp="2025-12-09 14:58:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:00:29.618081342 +0000 UTC m=+139.754794060" watchObservedRunningTime="2025-12-09 15:00:29.639382558 +0000 UTC m=+139.776095277" Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.674125 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-skgqs" podStartSLOduration=117.674107771 podStartE2EDuration="1m57.674107771s" podCreationTimestamp="2025-12-09 14:58:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:00:29.638775619 +0000 UTC m=+139.775488327" watchObservedRunningTime="2025-12-09 15:00:29.674107771 +0000 UTC m=+139.810820490" Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.695423 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-srlsw" podStartSLOduration=116.695406373 podStartE2EDuration="1m56.695406373s" podCreationTimestamp="2025-12-09 14:58:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:00:29.693290715 +0000 UTC m=+139.830003434" watchObservedRunningTime="2025-12-09 15:00:29.695406373 +0000 UTC m=+139.832119092" Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.696402 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5xggk" podStartSLOduration=116.696391088 podStartE2EDuration="1m56.696391088s" podCreationTimestamp="2025-12-09 14:58:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:00:29.663697914 +0000 UTC m=+139.800410632" watchObservedRunningTime="2025-12-09 15:00:29.696391088 +0000 UTC m=+139.833103806" Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.712776 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rsbss" podStartSLOduration=117.712763179 podStartE2EDuration="1m57.712763179s" podCreationTimestamp="2025-12-09 14:58:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:00:29.711503663 +0000 UTC m=+139.848216381" watchObservedRunningTime="2025-12-09 15:00:29.712763179 +0000 UTC m=+139.849475896" Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.721625 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-v5j6b" Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.722146 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 15:00:29 crc kubenswrapper[4555]: E1209 15:00:29.723784 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 15:00:30.223770888 +0000 UTC m=+140.360483606 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.788905 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-r9tjc" podStartSLOduration=117.78888502 podStartE2EDuration="1m57.78888502s" podCreationTimestamp="2025-12-09 14:58:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:00:29.755160873 +0000 UTC m=+139.891873591" watchObservedRunningTime="2025-12-09 15:00:29.78888502 +0000 UTC m=+139.925597738" Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.826576 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wd6tn\" (UID: \"eee41022-2a9c-41c8-bbc8-ff42874c4050\") " pod="openshift-image-registry/image-registry-697d97f7c8-wd6tn" Dec 09 15:00:29 crc kubenswrapper[4555]: E1209 15:00:29.826953 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 15:00:30.326943396 +0000 UTC m=+140.463656114 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wd6tn" (UID: "eee41022-2a9c-41c8-bbc8-ff42874c4050") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.927715 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 15:00:29 crc kubenswrapper[4555]: E1209 15:00:29.927906 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 15:00:30.427880892 +0000 UTC m=+140.564593610 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:29 crc kubenswrapper[4555]: I1209 15:00:29.927989 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wd6tn\" (UID: \"eee41022-2a9c-41c8-bbc8-ff42874c4050\") " pod="openshift-image-registry/image-registry-697d97f7c8-wd6tn" Dec 09 15:00:29 crc kubenswrapper[4555]: E1209 15:00:29.928225 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 15:00:30.428213722 +0000 UTC m=+140.564926440 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wd6tn" (UID: "eee41022-2a9c-41c8-bbc8-ff42874c4050") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:30 crc kubenswrapper[4555]: I1209 15:00:30.029737 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 15:00:30 crc kubenswrapper[4555]: E1209 15:00:30.029884 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 15:00:30.529863155 +0000 UTC m=+140.666575873 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:30 crc kubenswrapper[4555]: I1209 15:00:30.030140 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wd6tn\" (UID: \"eee41022-2a9c-41c8-bbc8-ff42874c4050\") " pod="openshift-image-registry/image-registry-697d97f7c8-wd6tn" Dec 09 15:00:30 crc kubenswrapper[4555]: E1209 15:00:30.030392 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 15:00:30.530384532 +0000 UTC m=+140.667097251 (durationBeforeRetry 500ms). 
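The repeated MountVolume.MountDevice and UnmountVolume.TearDown failures above all reduce to the same condition: no CSI driver named kubevirt.io.hostpath-provisioner is registered with the kubelet yet, so each attempt is requeued with a 500ms backoff. One way to observe the same state from the API side is to list the cluster-scoped CSIDriver objects and the per-node CSINode registration the kubelet publishes. A small client-go sketch, with the kubeconfig path and the node name "crc" as assumptions taken from this environment:

package main

import (
	"context"
	"fmt"
	"os"
	"path/filepath"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Assumption: a local kubeconfig with access to the cluster.
	kubeconfig := filepath.Join(os.Getenv("HOME"), ".kube", "config")
	cfg, err := clientcmd.BuildConfigFromFlags("", kubeconfig)
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	ctx := context.Background()

	// Cluster-scoped CSIDriver objects created by the driver's own deployment.
	drivers, err := cs.StorageV1().CSIDrivers().List(ctx, metav1.ListOptions{})
	if err != nil {
		panic(err)
	}
	for _, d := range drivers.Items {
		fmt.Println("CSIDriver:", d.Name)
	}

	// Per-node registration state published by the kubelet; "crc" is the node name in this log.
	csiNode, err := cs.StorageV1().CSINodes().Get(ctx, "crc", metav1.GetOptions{})
	if err != nil {
		panic(err)
	}
	for _, d := range csiNode.Spec.Drivers {
		fmt.Println("registered on node:", d.Name)
	}
}

Until the driver shows up under the node's CSINode spec, the kubelet keeps emitting the "not found in the list of registered CSI drivers" error seen above.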
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wd6tn" (UID: "eee41022-2a9c-41c8-bbc8-ff42874c4050") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:30 crc kubenswrapper[4555]: I1209 15:00:30.130866 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 15:00:30 crc kubenswrapper[4555]: E1209 15:00:30.131399 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 15:00:30.631385569 +0000 UTC m=+140.768098286 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:30 crc kubenswrapper[4555]: I1209 15:00:30.190477 4555 patch_prober.go:28] interesting pod/console-operator-58897d9998-7drj7 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.38:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 09 15:00:30 crc kubenswrapper[4555]: I1209 15:00:30.190523 4555 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-7drj7" podUID="5926dc72-d5e9-4c49-afba-ac1ce69678be" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.38:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 09 15:00:30 crc kubenswrapper[4555]: I1209 15:00:30.228823 4555 patch_prober.go:28] interesting pod/router-default-5444994796-pldb7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 09 15:00:30 crc kubenswrapper[4555]: [-]has-synced failed: reason withheld Dec 09 15:00:30 crc kubenswrapper[4555]: [+]process-running ok Dec 09 15:00:30 crc kubenswrapper[4555]: healthz check failed Dec 09 15:00:30 crc kubenswrapper[4555]: I1209 15:00:30.228869 4555 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pldb7" podUID="33ad6890-1e95-4a46-9f40-00cf0443d4a1" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 09 15:00:30 crc kubenswrapper[4555]: I1209 15:00:30.232570 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wd6tn\" (UID: \"eee41022-2a9c-41c8-bbc8-ff42874c4050\") " pod="openshift-image-registry/image-registry-697d97f7c8-wd6tn" Dec 09 15:00:30 crc kubenswrapper[4555]: E1209 15:00:30.232792 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 15:00:30.732783546 +0000 UTC m=+140.869496263 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wd6tn" (UID: "eee41022-2a9c-41c8-bbc8-ff42874c4050") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:30 crc kubenswrapper[4555]: I1209 15:00:30.333584 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 15:00:30 crc kubenswrapper[4555]: E1209 15:00:30.333743 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 15:00:30.833719147 +0000 UTC m=+140.970431865 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:30 crc kubenswrapper[4555]: I1209 15:00:30.333935 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wd6tn\" (UID: \"eee41022-2a9c-41c8-bbc8-ff42874c4050\") " pod="openshift-image-registry/image-registry-697d97f7c8-wd6tn" Dec 09 15:00:30 crc kubenswrapper[4555]: E1209 15:00:30.334206 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 15:00:30.834195489 +0000 UTC m=+140.970908208 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wd6tn" (UID: "eee41022-2a9c-41c8-bbc8-ff42874c4050") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:30 crc kubenswrapper[4555]: I1209 15:00:30.435331 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 15:00:30 crc kubenswrapper[4555]: E1209 15:00:30.435532 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 15:00:30.935509236 +0000 UTC m=+141.072221954 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:30 crc kubenswrapper[4555]: I1209 15:00:30.435804 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wd6tn\" (UID: \"eee41022-2a9c-41c8-bbc8-ff42874c4050\") " pod="openshift-image-registry/image-registry-697d97f7c8-wd6tn" Dec 09 15:00:30 crc kubenswrapper[4555]: E1209 15:00:30.436101 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 15:00:30.936091529 +0000 UTC m=+141.072804238 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wd6tn" (UID: "eee41022-2a9c-41c8-bbc8-ff42874c4050") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:30 crc kubenswrapper[4555]: I1209 15:00:30.483590 4555 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Dec 09 15:00:30 crc kubenswrapper[4555]: I1209 15:00:30.537039 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 15:00:30 crc kubenswrapper[4555]: E1209 15:00:30.537423 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 15:00:31.037409074 +0000 UTC m=+141.174121792 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:30 crc kubenswrapper[4555]: I1209 15:00:30.590994 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-xdwlv" event={"ID":"b7402585-182c-46a1-a8ac-257c97421c8d","Type":"ContainerStarted","Data":"f4a9da2f3fa3b91be0653527ffa52e4158f1e72ff3618775ea99fcbeb48bf450"} Dec 09 15:00:30 crc kubenswrapper[4555]: I1209 15:00:30.593138 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9l7sb" event={"ID":"2267e449-66c1-4699-897b-9851e54d2645","Type":"ContainerStarted","Data":"c05a84b6e1ad2ccab09934ac9686714c15b15b91eb9c10df2425f6d460624c68"} Dec 09 15:00:30 crc kubenswrapper[4555]: I1209 15:00:30.593274 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-9l7sb" Dec 09 15:00:30 crc kubenswrapper[4555]: I1209 15:00:30.594586 4555 generic.go:334] "Generic (PLEG): container finished" podID="3936549f-6055-4b0d-af18-7958d91c672f" containerID="5be729e8f1858bddb4d242a3062488ed3f6e1b540192989da0f33e818cea5809" exitCode=0 Dec 09 15:00:30 crc kubenswrapper[4555]: I1209 15:00:30.594653 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421540-hhxsf" event={"ID":"3936549f-6055-4b0d-af18-7958d91c672f","Type":"ContainerDied","Data":"5be729e8f1858bddb4d242a3062488ed3f6e1b540192989da0f33e818cea5809"} Dec 09 15:00:30 crc kubenswrapper[4555]: I1209 15:00:30.597115 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mtdft" 
event={"ID":"77dac49a-34a8-421c-99ab-1fa4a08fc6fa","Type":"ContainerStarted","Data":"47b6aaa2de5117d872d419f3ed1f684aabce184fa5f5f55e70b8230efe387b6b"} Dec 09 15:00:30 crc kubenswrapper[4555]: I1209 15:00:30.597228 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mtdft" Dec 09 15:00:30 crc kubenswrapper[4555]: I1209 15:00:30.598965 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-msfrj" event={"ID":"1908bc6b-523d-4ffd-8e4c-e73f3e3b7ee3","Type":"ContainerStarted","Data":"a22f85933d564a1fea55b2296004aab5f045d02ff5d72e44d3f1b0fe562c99c5"} Dec 09 15:00:30 crc kubenswrapper[4555]: I1209 15:00:30.601488 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-2wsmx" event={"ID":"0445dab5-5a42-4098-9fe0-e38b5650ecd6","Type":"ContainerStarted","Data":"546ed32ee7d7cd5f56b6bad6f300a0231f5451763307cb8949071cd58fe85acb"} Dec 09 15:00:30 crc kubenswrapper[4555]: I1209 15:00:30.601516 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-2wsmx" event={"ID":"0445dab5-5a42-4098-9fe0-e38b5650ecd6","Type":"ContainerStarted","Data":"71b76424181268a1e530c25822305a401d9a13fb426842939643fe880348e7cb"} Dec 09 15:00:30 crc kubenswrapper[4555]: I1209 15:00:30.602105 4555 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-srlsw container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/healthz\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Dec 09 15:00:30 crc kubenswrapper[4555]: I1209 15:00:30.602152 4555 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-srlsw" podUID="8c1c0513-5d12-463c-b66d-c1a5d02eafc0" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.27:8080/healthz\": dial tcp 10.217.0.27:8080: connect: connection refused" Dec 09 15:00:30 crc kubenswrapper[4555]: I1209 15:00:30.609837 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gt6wf" Dec 09 15:00:30 crc kubenswrapper[4555]: I1209 15:00:30.610179 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7trl6" Dec 09 15:00:30 crc kubenswrapper[4555]: I1209 15:00:30.612534 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-7drj7" Dec 09 15:00:30 crc kubenswrapper[4555]: I1209 15:00:30.635418 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-9l7sb" podStartSLOduration=7.635404147 podStartE2EDuration="7.635404147s" podCreationTimestamp="2025-12-09 15:00:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:00:30.621591975 +0000 UTC m=+140.758304693" watchObservedRunningTime="2025-12-09 15:00:30.635404147 +0000 UTC m=+140.772116864" Dec 09 15:00:30 crc kubenswrapper[4555]: I1209 15:00:30.638794 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wd6tn\" (UID: \"eee41022-2a9c-41c8-bbc8-ff42874c4050\") " pod="openshift-image-registry/image-registry-697d97f7c8-wd6tn" Dec 09 15:00:30 crc kubenswrapper[4555]: E1209 15:00:30.639191 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 15:00:31.139179607 +0000 UTC m=+141.275892326 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wd6tn" (UID: "eee41022-2a9c-41c8-bbc8-ff42874c4050") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:30 crc kubenswrapper[4555]: I1209 15:00:30.668789 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mtdft" podStartSLOduration=118.668773541 podStartE2EDuration="1m58.668773541s" podCreationTimestamp="2025-12-09 14:58:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:00:30.667051199 +0000 UTC m=+140.803763917" watchObservedRunningTime="2025-12-09 15:00:30.668773541 +0000 UTC m=+140.805486259" Dec 09 15:00:30 crc kubenswrapper[4555]: I1209 15:00:30.729862 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-msfrj" podStartSLOduration=118.729845176 podStartE2EDuration="1m58.729845176s" podCreationTimestamp="2025-12-09 14:58:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:00:30.729186108 +0000 UTC m=+140.865898826" watchObservedRunningTime="2025-12-09 15:00:30.729845176 +0000 UTC m=+140.866557895" Dec 09 15:00:30 crc kubenswrapper[4555]: I1209 15:00:30.740004 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5xggk" Dec 09 15:00:30 crc kubenswrapper[4555]: I1209 15:00:30.740171 4555 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5xggk" Dec 09 15:00:30 crc kubenswrapper[4555]: I1209 15:00:30.740417 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 15:00:30 crc kubenswrapper[4555]: E1209 15:00:30.741706 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 15:00:31.241691494 +0000 UTC m=+141.378404212 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:30 crc kubenswrapper[4555]: I1209 15:00:30.759510 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-xdwlv" Dec 09 15:00:30 crc kubenswrapper[4555]: I1209 15:00:30.759774 4555 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-xdwlv" Dec 09 15:00:30 crc kubenswrapper[4555]: I1209 15:00:30.760933 4555 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5xggk" Dec 09 15:00:30 crc kubenswrapper[4555]: I1209 15:00:30.842936 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wd6tn\" (UID: \"eee41022-2a9c-41c8-bbc8-ff42874c4050\") " pod="openshift-image-registry/image-registry-697d97f7c8-wd6tn" Dec 09 15:00:30 crc kubenswrapper[4555]: E1209 15:00:30.843248 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 15:00:31.343235168 +0000 UTC m=+141.479947886 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wd6tn" (UID: "eee41022-2a9c-41c8-bbc8-ff42874c4050") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:30 crc kubenswrapper[4555]: I1209 15:00:30.853426 4555 patch_prober.go:28] interesting pod/apiserver-76f77b778f-xdwlv container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 09 15:00:30 crc kubenswrapper[4555]: [+]log ok Dec 09 15:00:30 crc kubenswrapper[4555]: [+]etcd ok Dec 09 15:00:30 crc kubenswrapper[4555]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 09 15:00:30 crc kubenswrapper[4555]: [+]poststarthook/generic-apiserver-start-informers ok Dec 09 15:00:30 crc kubenswrapper[4555]: [+]poststarthook/max-in-flight-filter ok Dec 09 15:00:30 crc kubenswrapper[4555]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 09 15:00:30 crc kubenswrapper[4555]: [+]poststarthook/image.openshift.io-apiserver-caches ok Dec 09 15:00:30 crc kubenswrapper[4555]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Dec 09 15:00:30 crc kubenswrapper[4555]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Dec 09 15:00:30 crc kubenswrapper[4555]: [-]poststarthook/project.openshift.io-projectcache failed: reason withheld Dec 09 15:00:30 crc kubenswrapper[4555]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Dec 09 15:00:30 crc kubenswrapper[4555]: [-]poststarthook/openshift.io-startinformers failed: reason withheld Dec 09 15:00:30 crc kubenswrapper[4555]: [+]poststarthook/openshift.io-restmapperupdater ok Dec 09 15:00:30 crc kubenswrapper[4555]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Dec 09 15:00:30 crc kubenswrapper[4555]: livez check failed Dec 09 15:00:30 crc kubenswrapper[4555]: I1209 15:00:30.853930 4555 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-xdwlv" podUID="b7402585-182c-46a1-a8ac-257c97421c8d" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 09 15:00:30 crc kubenswrapper[4555]: I1209 15:00:30.943785 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 15:00:30 crc kubenswrapper[4555]: E1209 15:00:30.944099 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 15:00:31.444084716 +0000 UTC m=+141.580797434 (durationBeforeRetry 500ms). 
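The startup and readiness probe entries (router-default, console-operator, marketplace-operator, and the openshift-apiserver check list above) are plain HTTP(S) GETs issued by the kubelet prober; the "[+]/[-] ... ok/failed" lines are the verbose body of the /healthz or /readyz endpoint, which patch_prober records as start-of-body. The same body can be fetched by hand; a minimal sketch against the marketplace-operator endpoint quoted in an earlier entry (the URL is only reachable from the node or the cluster network, and the short timeout mirrors the prober's):

package main

import (
	"fmt"
	"io"
	"net/http"
	"time"
)

func main() {
	client := &http.Client{Timeout: 2 * time.Second}

	// Endpoint quoted in the marketplace-operator readiness probe entry above.
	resp, err := client.Get("http://10.217.0.27:8080/healthz")
	if err != nil {
		// e.g. "connect: connection refused" while the container is still coming up,
		// which is exactly what the prober reported for this pod.
		fmt.Println("probe failed:", err)
		return
	}
	defer resp.Body.Close()

	body, _ := io.ReadAll(resp.Body)
	// A non-2xx status is what the kubelet logs as "HTTP probe failed with statuscode: <code>".
	fmt.Println("status:", resp.StatusCode)
	fmt.Println(string(body))
}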
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:31 crc kubenswrapper[4555]: I1209 15:00:31.045018 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wd6tn\" (UID: \"eee41022-2a9c-41c8-bbc8-ff42874c4050\") " pod="openshift-image-registry/image-registry-697d97f7c8-wd6tn" Dec 09 15:00:31 crc kubenswrapper[4555]: E1209 15:00:31.045369 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 15:00:31.545355452 +0000 UTC m=+141.682068171 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wd6tn" (UID: "eee41022-2a9c-41c8-bbc8-ff42874c4050") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:31 crc kubenswrapper[4555]: I1209 15:00:31.146606 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 15:00:31 crc kubenswrapper[4555]: E1209 15:00:31.146776 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 15:00:31.646754501 +0000 UTC m=+141.783467220 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:31 crc kubenswrapper[4555]: I1209 15:00:31.146816 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wd6tn\" (UID: \"eee41022-2a9c-41c8-bbc8-ff42874c4050\") " pod="openshift-image-registry/image-registry-697d97f7c8-wd6tn" Dec 09 15:00:31 crc kubenswrapper[4555]: E1209 15:00:31.147042 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 15:00:31.64703325 +0000 UTC m=+141.783745968 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wd6tn" (UID: "eee41022-2a9c-41c8-bbc8-ff42874c4050") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:31 crc kubenswrapper[4555]: I1209 15:00:31.225125 4555 patch_prober.go:28] interesting pod/router-default-5444994796-pldb7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 09 15:00:31 crc kubenswrapper[4555]: [-]has-synced failed: reason withheld Dec 09 15:00:31 crc kubenswrapper[4555]: [+]process-running ok Dec 09 15:00:31 crc kubenswrapper[4555]: healthz check failed Dec 09 15:00:31 crc kubenswrapper[4555]: I1209 15:00:31.225184 4555 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pldb7" podUID="33ad6890-1e95-4a46-9f40-00cf0443d4a1" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 09 15:00:31 crc kubenswrapper[4555]: I1209 15:00:31.248012 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 15:00:31 crc kubenswrapper[4555]: E1209 15:00:31.248324 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 15:00:31.748254192 +0000 UTC m=+141.884966910 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:31 crc kubenswrapper[4555]: I1209 15:00:31.248418 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wd6tn\" (UID: \"eee41022-2a9c-41c8-bbc8-ff42874c4050\") " pod="openshift-image-registry/image-registry-697d97f7c8-wd6tn" Dec 09 15:00:31 crc kubenswrapper[4555]: E1209 15:00:31.248742 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 15:00:31.74872327 +0000 UTC m=+141.885435987 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wd6tn" (UID: "eee41022-2a9c-41c8-bbc8-ff42874c4050") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:31 crc kubenswrapper[4555]: I1209 15:00:31.286834 4555 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-12-09T15:00:30.483613Z","Handler":null,"Name":""} Dec 09 15:00:31 crc kubenswrapper[4555]: I1209 15:00:31.296886 4555 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Dec 09 15:00:31 crc kubenswrapper[4555]: I1209 15:00:31.296924 4555 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Dec 09 15:00:31 crc kubenswrapper[4555]: I1209 15:00:31.350211 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 15:00:31 crc kubenswrapper[4555]: I1209 15:00:31.353448 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 09 15:00:31 crc kubenswrapper[4555]: I1209 15:00:31.381450 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bdb9r"] Dec 09 15:00:31 crc kubenswrapper[4555]: I1209 15:00:31.382186 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bdb9r" Dec 09 15:00:31 crc kubenswrapper[4555]: I1209 15:00:31.383619 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 09 15:00:31 crc kubenswrapper[4555]: I1209 15:00:31.390131 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bdb9r"] Dec 09 15:00:31 crc kubenswrapper[4555]: I1209 15:00:31.451662 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wd6tn\" (UID: \"eee41022-2a9c-41c8-bbc8-ff42874c4050\") " pod="openshift-image-registry/image-registry-697d97f7c8-wd6tn" Dec 09 15:00:31 crc kubenswrapper[4555]: I1209 15:00:31.454195 4555 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 09 15:00:31 crc kubenswrapper[4555]: I1209 15:00:31.454233 4555 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wd6tn\" (UID: \"eee41022-2a9c-41c8-bbc8-ff42874c4050\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-wd6tn" Dec 09 15:00:31 crc kubenswrapper[4555]: I1209 15:00:31.475970 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wd6tn\" (UID: \"eee41022-2a9c-41c8-bbc8-ff42874c4050\") " pod="openshift-image-registry/image-registry-697d97f7c8-wd6tn" Dec 09 15:00:31 crc kubenswrapper[4555]: I1209 15:00:31.493177 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-wd6tn" Dec 09 15:00:31 crc kubenswrapper[4555]: I1209 15:00:31.553251 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd5cca88-9272-46b2-89f1-b0c2b44e2a9f-catalog-content\") pod \"certified-operators-bdb9r\" (UID: \"fd5cca88-9272-46b2-89f1-b0c2b44e2a9f\") " pod="openshift-marketplace/certified-operators-bdb9r" Dec 09 15:00:31 crc kubenswrapper[4555]: I1209 15:00:31.553549 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd5cca88-9272-46b2-89f1-b0c2b44e2a9f-utilities\") pod \"certified-operators-bdb9r\" (UID: \"fd5cca88-9272-46b2-89f1-b0c2b44e2a9f\") " pod="openshift-marketplace/certified-operators-bdb9r" Dec 09 15:00:31 crc kubenswrapper[4555]: I1209 15:00:31.553594 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drswg\" (UniqueName: \"kubernetes.io/projected/fd5cca88-9272-46b2-89f1-b0c2b44e2a9f-kube-api-access-drswg\") pod \"certified-operators-bdb9r\" (UID: \"fd5cca88-9272-46b2-89f1-b0c2b44e2a9f\") " pod="openshift-marketplace/certified-operators-bdb9r" Dec 09 15:00:31 crc kubenswrapper[4555]: I1209 15:00:31.582421 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mc787"] Dec 09 15:00:31 crc kubenswrapper[4555]: I1209 15:00:31.583341 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mc787" Dec 09 15:00:31 crc kubenswrapper[4555]: I1209 15:00:31.585479 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 09 15:00:31 crc kubenswrapper[4555]: I1209 15:00:31.603244 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mc787"] Dec 09 15:00:31 crc kubenswrapper[4555]: I1209 15:00:31.622348 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" Dec 09 15:00:31 crc kubenswrapper[4555]: I1209 15:00:31.629317 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-2wsmx" event={"ID":"0445dab5-5a42-4098-9fe0-e38b5650ecd6","Type":"ContainerStarted","Data":"09023badd4016e1b40bb6a8cb776e537add6b4cfcc60d674511c16d8305d15bf"} Dec 09 15:00:31 crc kubenswrapper[4555]: I1209 15:00:31.632106 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-srlsw" Dec 09 15:00:31 crc kubenswrapper[4555]: I1209 15:00:31.634259 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5xggk" Dec 09 15:00:31 crc kubenswrapper[4555]: I1209 15:00:31.655326 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd5cca88-9272-46b2-89f1-b0c2b44e2a9f-catalog-content\") pod \"certified-operators-bdb9r\" (UID: \"fd5cca88-9272-46b2-89f1-b0c2b44e2a9f\") " pod="openshift-marketplace/certified-operators-bdb9r" Dec 09 15:00:31 crc kubenswrapper[4555]: I1209 15:00:31.655387 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/fd5cca88-9272-46b2-89f1-b0c2b44e2a9f-utilities\") pod \"certified-operators-bdb9r\" (UID: \"fd5cca88-9272-46b2-89f1-b0c2b44e2a9f\") " pod="openshift-marketplace/certified-operators-bdb9r" Dec 09 15:00:31 crc kubenswrapper[4555]: I1209 15:00:31.655434 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drswg\" (UniqueName: \"kubernetes.io/projected/fd5cca88-9272-46b2-89f1-b0c2b44e2a9f-kube-api-access-drswg\") pod \"certified-operators-bdb9r\" (UID: \"fd5cca88-9272-46b2-89f1-b0c2b44e2a9f\") " pod="openshift-marketplace/certified-operators-bdb9r" Dec 09 15:00:31 crc kubenswrapper[4555]: I1209 15:00:31.655896 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-wd6tn"] Dec 09 15:00:31 crc kubenswrapper[4555]: I1209 15:00:31.656562 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd5cca88-9272-46b2-89f1-b0c2b44e2a9f-utilities\") pod \"certified-operators-bdb9r\" (UID: \"fd5cca88-9272-46b2-89f1-b0c2b44e2a9f\") " pod="openshift-marketplace/certified-operators-bdb9r" Dec 09 15:00:31 crc kubenswrapper[4555]: I1209 15:00:31.656873 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd5cca88-9272-46b2-89f1-b0c2b44e2a9f-catalog-content\") pod \"certified-operators-bdb9r\" (UID: \"fd5cca88-9272-46b2-89f1-b0c2b44e2a9f\") " pod="openshift-marketplace/certified-operators-bdb9r" Dec 09 15:00:31 crc kubenswrapper[4555]: I1209 15:00:31.665103 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-2wsmx" podStartSLOduration=8.665083515 podStartE2EDuration="8.665083515s" podCreationTimestamp="2025-12-09 15:00:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:00:31.664575583 +0000 UTC m=+141.801288301" watchObservedRunningTime="2025-12-09 15:00:31.665083515 +0000 UTC m=+141.801796234" Dec 09 15:00:31 crc kubenswrapper[4555]: W1209 15:00:31.680982 4555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeee41022_2a9c_41c8_bbc8_ff42874c4050.slice/crio-7ed3ad04539c4041ff4a4e69a1a26dc71100e7c02fe83bef54acdffc8de5e5ae WatchSource:0}: Error finding container 7ed3ad04539c4041ff4a4e69a1a26dc71100e7c02fe83bef54acdffc8de5e5ae: Status 404 returned error can't find the container with id 7ed3ad04539c4041ff4a4e69a1a26dc71100e7c02fe83bef54acdffc8de5e5ae Dec 09 15:00:31 crc kubenswrapper[4555]: I1209 15:00:31.693058 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drswg\" (UniqueName: \"kubernetes.io/projected/fd5cca88-9272-46b2-89f1-b0c2b44e2a9f-kube-api-access-drswg\") pod \"certified-operators-bdb9r\" (UID: \"fd5cca88-9272-46b2-89f1-b0c2b44e2a9f\") " pod="openshift-marketplace/certified-operators-bdb9r" Dec 09 15:00:31 crc kubenswrapper[4555]: I1209 15:00:31.694075 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bdb9r" Dec 09 15:00:31 crc kubenswrapper[4555]: I1209 15:00:31.762508 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14dca34a-64a8-4190-aa9a-07c5cb2bdf33-utilities\") pod \"community-operators-mc787\" (UID: \"14dca34a-64a8-4190-aa9a-07c5cb2bdf33\") " pod="openshift-marketplace/community-operators-mc787" Dec 09 15:00:31 crc kubenswrapper[4555]: I1209 15:00:31.763770 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmvwx\" (UniqueName: \"kubernetes.io/projected/14dca34a-64a8-4190-aa9a-07c5cb2bdf33-kube-api-access-cmvwx\") pod \"community-operators-mc787\" (UID: \"14dca34a-64a8-4190-aa9a-07c5cb2bdf33\") " pod="openshift-marketplace/community-operators-mc787" Dec 09 15:00:31 crc kubenswrapper[4555]: I1209 15:00:31.763902 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14dca34a-64a8-4190-aa9a-07c5cb2bdf33-catalog-content\") pod \"community-operators-mc787\" (UID: \"14dca34a-64a8-4190-aa9a-07c5cb2bdf33\") " pod="openshift-marketplace/community-operators-mc787" Dec 09 15:00:31 crc kubenswrapper[4555]: I1209 15:00:31.811266 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nnzzh"] Dec 09 15:00:31 crc kubenswrapper[4555]: I1209 15:00:31.812085 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nnzzh" Dec 09 15:00:31 crc kubenswrapper[4555]: I1209 15:00:31.822871 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nnzzh"] Dec 09 15:00:31 crc kubenswrapper[4555]: I1209 15:00:31.866007 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ee87b32-b776-445a-b3f5-d94c87430742-utilities\") pod \"certified-operators-nnzzh\" (UID: \"9ee87b32-b776-445a-b3f5-d94c87430742\") " pod="openshift-marketplace/certified-operators-nnzzh" Dec 09 15:00:31 crc kubenswrapper[4555]: I1209 15:00:31.866043 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ee87b32-b776-445a-b3f5-d94c87430742-catalog-content\") pod \"certified-operators-nnzzh\" (UID: \"9ee87b32-b776-445a-b3f5-d94c87430742\") " pod="openshift-marketplace/certified-operators-nnzzh" Dec 09 15:00:31 crc kubenswrapper[4555]: I1209 15:00:31.866083 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pc6m\" (UniqueName: \"kubernetes.io/projected/9ee87b32-b776-445a-b3f5-d94c87430742-kube-api-access-9pc6m\") pod \"certified-operators-nnzzh\" (UID: \"9ee87b32-b776-445a-b3f5-d94c87430742\") " pod="openshift-marketplace/certified-operators-nnzzh" Dec 09 15:00:31 crc kubenswrapper[4555]: I1209 15:00:31.866109 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14dca34a-64a8-4190-aa9a-07c5cb2bdf33-catalog-content\") pod \"community-operators-mc787\" (UID: \"14dca34a-64a8-4190-aa9a-07c5cb2bdf33\") " pod="openshift-marketplace/community-operators-mc787" Dec 09 15:00:31 crc kubenswrapper[4555]: I1209 
15:00:31.866164 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14dca34a-64a8-4190-aa9a-07c5cb2bdf33-utilities\") pod \"community-operators-mc787\" (UID: \"14dca34a-64a8-4190-aa9a-07c5cb2bdf33\") " pod="openshift-marketplace/community-operators-mc787" Dec 09 15:00:31 crc kubenswrapper[4555]: I1209 15:00:31.866243 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmvwx\" (UniqueName: \"kubernetes.io/projected/14dca34a-64a8-4190-aa9a-07c5cb2bdf33-kube-api-access-cmvwx\") pod \"community-operators-mc787\" (UID: \"14dca34a-64a8-4190-aa9a-07c5cb2bdf33\") " pod="openshift-marketplace/community-operators-mc787" Dec 09 15:00:31 crc kubenswrapper[4555]: I1209 15:00:31.866653 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14dca34a-64a8-4190-aa9a-07c5cb2bdf33-utilities\") pod \"community-operators-mc787\" (UID: \"14dca34a-64a8-4190-aa9a-07c5cb2bdf33\") " pod="openshift-marketplace/community-operators-mc787" Dec 09 15:00:31 crc kubenswrapper[4555]: I1209 15:00:31.866685 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14dca34a-64a8-4190-aa9a-07c5cb2bdf33-catalog-content\") pod \"community-operators-mc787\" (UID: \"14dca34a-64a8-4190-aa9a-07c5cb2bdf33\") " pod="openshift-marketplace/community-operators-mc787" Dec 09 15:00:31 crc kubenswrapper[4555]: I1209 15:00:31.873561 4555 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421540-hhxsf" Dec 09 15:00:31 crc kubenswrapper[4555]: I1209 15:00:31.887907 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmvwx\" (UniqueName: \"kubernetes.io/projected/14dca34a-64a8-4190-aa9a-07c5cb2bdf33-kube-api-access-cmvwx\") pod \"community-operators-mc787\" (UID: \"14dca34a-64a8-4190-aa9a-07c5cb2bdf33\") " pod="openshift-marketplace/community-operators-mc787" Dec 09 15:00:31 crc kubenswrapper[4555]: I1209 15:00:31.918004 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mc787" Dec 09 15:00:31 crc kubenswrapper[4555]: I1209 15:00:31.966828 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbhqh\" (UniqueName: \"kubernetes.io/projected/3936549f-6055-4b0d-af18-7958d91c672f-kube-api-access-wbhqh\") pod \"3936549f-6055-4b0d-af18-7958d91c672f\" (UID: \"3936549f-6055-4b0d-af18-7958d91c672f\") " Dec 09 15:00:31 crc kubenswrapper[4555]: I1209 15:00:31.966906 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3936549f-6055-4b0d-af18-7958d91c672f-config-volume\") pod \"3936549f-6055-4b0d-af18-7958d91c672f\" (UID: \"3936549f-6055-4b0d-af18-7958d91c672f\") " Dec 09 15:00:31 crc kubenswrapper[4555]: I1209 15:00:31.966980 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3936549f-6055-4b0d-af18-7958d91c672f-secret-volume\") pod \"3936549f-6055-4b0d-af18-7958d91c672f\" (UID: \"3936549f-6055-4b0d-af18-7958d91c672f\") " Dec 09 15:00:31 crc kubenswrapper[4555]: I1209 15:00:31.967146 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ee87b32-b776-445a-b3f5-d94c87430742-utilities\") pod \"certified-operators-nnzzh\" (UID: \"9ee87b32-b776-445a-b3f5-d94c87430742\") " pod="openshift-marketplace/certified-operators-nnzzh" Dec 09 15:00:31 crc kubenswrapper[4555]: I1209 15:00:31.967169 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ee87b32-b776-445a-b3f5-d94c87430742-catalog-content\") pod \"certified-operators-nnzzh\" (UID: \"9ee87b32-b776-445a-b3f5-d94c87430742\") " pod="openshift-marketplace/certified-operators-nnzzh" Dec 09 15:00:31 crc kubenswrapper[4555]: I1209 15:00:31.967200 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pc6m\" (UniqueName: \"kubernetes.io/projected/9ee87b32-b776-445a-b3f5-d94c87430742-kube-api-access-9pc6m\") pod \"certified-operators-nnzzh\" (UID: \"9ee87b32-b776-445a-b3f5-d94c87430742\") " pod="openshift-marketplace/certified-operators-nnzzh" Dec 09 15:00:31 crc kubenswrapper[4555]: I1209 15:00:31.967800 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3936549f-6055-4b0d-af18-7958d91c672f-config-volume" (OuterVolumeSpecName: "config-volume") pod "3936549f-6055-4b0d-af18-7958d91c672f" (UID: "3936549f-6055-4b0d-af18-7958d91c672f"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 15:00:31 crc kubenswrapper[4555]: I1209 15:00:31.967917 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ee87b32-b776-445a-b3f5-d94c87430742-utilities\") pod \"certified-operators-nnzzh\" (UID: \"9ee87b32-b776-445a-b3f5-d94c87430742\") " pod="openshift-marketplace/certified-operators-nnzzh" Dec 09 15:00:31 crc kubenswrapper[4555]: I1209 15:00:31.968039 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ee87b32-b776-445a-b3f5-d94c87430742-catalog-content\") pod \"certified-operators-nnzzh\" (UID: \"9ee87b32-b776-445a-b3f5-d94c87430742\") " pod="openshift-marketplace/certified-operators-nnzzh" Dec 09 15:00:31 crc kubenswrapper[4555]: I1209 15:00:31.970547 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3936549f-6055-4b0d-af18-7958d91c672f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3936549f-6055-4b0d-af18-7958d91c672f" (UID: "3936549f-6055-4b0d-af18-7958d91c672f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 15:00:31 crc kubenswrapper[4555]: I1209 15:00:31.970768 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3936549f-6055-4b0d-af18-7958d91c672f-kube-api-access-wbhqh" (OuterVolumeSpecName: "kube-api-access-wbhqh") pod "3936549f-6055-4b0d-af18-7958d91c672f" (UID: "3936549f-6055-4b0d-af18-7958d91c672f"). InnerVolumeSpecName "kube-api-access-wbhqh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:00:31 crc kubenswrapper[4555]: I1209 15:00:31.987762 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kwts6"] Dec 09 15:00:31 crc kubenswrapper[4555]: E1209 15:00:31.987994 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3936549f-6055-4b0d-af18-7958d91c672f" containerName="collect-profiles" Dec 09 15:00:31 crc kubenswrapper[4555]: I1209 15:00:31.988010 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="3936549f-6055-4b0d-af18-7958d91c672f" containerName="collect-profiles" Dec 09 15:00:31 crc kubenswrapper[4555]: I1209 15:00:31.988101 4555 memory_manager.go:354] "RemoveStaleState removing state" podUID="3936549f-6055-4b0d-af18-7958d91c672f" containerName="collect-profiles" Dec 09 15:00:31 crc kubenswrapper[4555]: I1209 15:00:31.988748 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kwts6" Dec 09 15:00:31 crc kubenswrapper[4555]: I1209 15:00:31.989220 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pc6m\" (UniqueName: \"kubernetes.io/projected/9ee87b32-b776-445a-b3f5-d94c87430742-kube-api-access-9pc6m\") pod \"certified-operators-nnzzh\" (UID: \"9ee87b32-b776-445a-b3f5-d94c87430742\") " pod="openshift-marketplace/certified-operators-nnzzh" Dec 09 15:00:32 crc kubenswrapper[4555]: I1209 15:00:32.005608 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kwts6"] Dec 09 15:00:32 crc kubenswrapper[4555]: I1209 15:00:32.037559 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bdb9r"] Dec 09 15:00:32 crc kubenswrapper[4555]: I1209 15:00:32.067581 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8c6gp\" (UniqueName: \"kubernetes.io/projected/b86a6402-17c6-4be0-b3ab-7d0bd357f8b2-kube-api-access-8c6gp\") pod \"community-operators-kwts6\" (UID: \"b86a6402-17c6-4be0-b3ab-7d0bd357f8b2\") " pod="openshift-marketplace/community-operators-kwts6" Dec 09 15:00:32 crc kubenswrapper[4555]: I1209 15:00:32.067645 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b86a6402-17c6-4be0-b3ab-7d0bd357f8b2-catalog-content\") pod \"community-operators-kwts6\" (UID: \"b86a6402-17c6-4be0-b3ab-7d0bd357f8b2\") " pod="openshift-marketplace/community-operators-kwts6" Dec 09 15:00:32 crc kubenswrapper[4555]: I1209 15:00:32.067663 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b86a6402-17c6-4be0-b3ab-7d0bd357f8b2-utilities\") pod \"community-operators-kwts6\" (UID: \"b86a6402-17c6-4be0-b3ab-7d0bd357f8b2\") " pod="openshift-marketplace/community-operators-kwts6" Dec 09 15:00:32 crc kubenswrapper[4555]: I1209 15:00:32.067827 4555 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3936549f-6055-4b0d-af18-7958d91c672f-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 09 15:00:32 crc kubenswrapper[4555]: I1209 15:00:32.067855 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbhqh\" (UniqueName: \"kubernetes.io/projected/3936549f-6055-4b0d-af18-7958d91c672f-kube-api-access-wbhqh\") on node \"crc\" DevicePath \"\"" Dec 09 15:00:32 crc kubenswrapper[4555]: I1209 15:00:32.067865 4555 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3936549f-6055-4b0d-af18-7958d91c672f-config-volume\") on node \"crc\" DevicePath \"\"" Dec 09 15:00:32 crc kubenswrapper[4555]: I1209 15:00:32.139063 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nnzzh" Dec 09 15:00:32 crc kubenswrapper[4555]: I1209 15:00:32.168915 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8c6gp\" (UniqueName: \"kubernetes.io/projected/b86a6402-17c6-4be0-b3ab-7d0bd357f8b2-kube-api-access-8c6gp\") pod \"community-operators-kwts6\" (UID: \"b86a6402-17c6-4be0-b3ab-7d0bd357f8b2\") " pod="openshift-marketplace/community-operators-kwts6" Dec 09 15:00:32 crc kubenswrapper[4555]: I1209 15:00:32.168995 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b86a6402-17c6-4be0-b3ab-7d0bd357f8b2-catalog-content\") pod \"community-operators-kwts6\" (UID: \"b86a6402-17c6-4be0-b3ab-7d0bd357f8b2\") " pod="openshift-marketplace/community-operators-kwts6" Dec 09 15:00:32 crc kubenswrapper[4555]: I1209 15:00:32.169024 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b86a6402-17c6-4be0-b3ab-7d0bd357f8b2-utilities\") pod \"community-operators-kwts6\" (UID: \"b86a6402-17c6-4be0-b3ab-7d0bd357f8b2\") " pod="openshift-marketplace/community-operators-kwts6" Dec 09 15:00:32 crc kubenswrapper[4555]: I1209 15:00:32.169510 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b86a6402-17c6-4be0-b3ab-7d0bd357f8b2-utilities\") pod \"community-operators-kwts6\" (UID: \"b86a6402-17c6-4be0-b3ab-7d0bd357f8b2\") " pod="openshift-marketplace/community-operators-kwts6" Dec 09 15:00:32 crc kubenswrapper[4555]: I1209 15:00:32.169983 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b86a6402-17c6-4be0-b3ab-7d0bd357f8b2-catalog-content\") pod \"community-operators-kwts6\" (UID: \"b86a6402-17c6-4be0-b3ab-7d0bd357f8b2\") " pod="openshift-marketplace/community-operators-kwts6" Dec 09 15:00:32 crc kubenswrapper[4555]: I1209 15:00:32.172928 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Dec 09 15:00:32 crc kubenswrapper[4555]: I1209 15:00:32.184084 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8c6gp\" (UniqueName: \"kubernetes.io/projected/b86a6402-17c6-4be0-b3ab-7d0bd357f8b2-kube-api-access-8c6gp\") pod \"community-operators-kwts6\" (UID: \"b86a6402-17c6-4be0-b3ab-7d0bd357f8b2\") " pod="openshift-marketplace/community-operators-kwts6" Dec 09 15:00:32 crc kubenswrapper[4555]: I1209 15:00:32.225003 4555 patch_prober.go:28] interesting pod/router-default-5444994796-pldb7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 09 15:00:32 crc kubenswrapper[4555]: [-]has-synced failed: reason withheld Dec 09 15:00:32 crc kubenswrapper[4555]: [+]process-running ok Dec 09 15:00:32 crc kubenswrapper[4555]: healthz check failed Dec 09 15:00:32 crc kubenswrapper[4555]: I1209 15:00:32.225262 4555 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pldb7" podUID="33ad6890-1e95-4a46-9f40-00cf0443d4a1" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 09 15:00:32 crc 
kubenswrapper[4555]: I1209 15:00:32.281738 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nnzzh"] Dec 09 15:00:32 crc kubenswrapper[4555]: W1209 15:00:32.289404 4555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ee87b32_b776_445a_b3f5_d94c87430742.slice/crio-4255d4d1a34060f2ad4652af104e6a193491cabc20a1518c7a3fae1ce746d0a1 WatchSource:0}: Error finding container 4255d4d1a34060f2ad4652af104e6a193491cabc20a1518c7a3fae1ce746d0a1: Status 404 returned error can't find the container with id 4255d4d1a34060f2ad4652af104e6a193491cabc20a1518c7a3fae1ce746d0a1 Dec 09 15:00:32 crc kubenswrapper[4555]: I1209 15:00:32.316781 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kwts6" Dec 09 15:00:32 crc kubenswrapper[4555]: I1209 15:00:32.320297 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mc787"] Dec 09 15:00:32 crc kubenswrapper[4555]: W1209 15:00:32.336631 4555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14dca34a_64a8_4190_aa9a_07c5cb2bdf33.slice/crio-d475c7287cf23068800595aa02c9ef28eb1d21373a0436cc3dcbbfac253d5807 WatchSource:0}: Error finding container d475c7287cf23068800595aa02c9ef28eb1d21373a0436cc3dcbbfac253d5807: Status 404 returned error can't find the container with id d475c7287cf23068800595aa02c9ef28eb1d21373a0436cc3dcbbfac253d5807 Dec 09 15:00:32 crc kubenswrapper[4555]: I1209 15:00:32.476701 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kwts6"] Dec 09 15:00:32 crc kubenswrapper[4555]: W1209 15:00:32.518895 4555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb86a6402_17c6_4be0_b3ab_7d0bd357f8b2.slice/crio-73186d0f13e83df9c0000055980b9bb60c0df511ec683aa8744cca68418526ae WatchSource:0}: Error finding container 73186d0f13e83df9c0000055980b9bb60c0df511ec683aa8744cca68418526ae: Status 404 returned error can't find the container with id 73186d0f13e83df9c0000055980b9bb60c0df511ec683aa8744cca68418526ae Dec 09 15:00:32 crc kubenswrapper[4555]: I1209 15:00:32.641225 4555 generic.go:334] "Generic (PLEG): container finished" podID="fd5cca88-9272-46b2-89f1-b0c2b44e2a9f" containerID="1eef4e539ca6e125894f0c0150478b564bac0d394786f6aa00d0b1ef2d5ae2b1" exitCode=0 Dec 09 15:00:32 crc kubenswrapper[4555]: I1209 15:00:32.641296 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bdb9r" event={"ID":"fd5cca88-9272-46b2-89f1-b0c2b44e2a9f","Type":"ContainerDied","Data":"1eef4e539ca6e125894f0c0150478b564bac0d394786f6aa00d0b1ef2d5ae2b1"} Dec 09 15:00:32 crc kubenswrapper[4555]: I1209 15:00:32.641740 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bdb9r" event={"ID":"fd5cca88-9272-46b2-89f1-b0c2b44e2a9f","Type":"ContainerStarted","Data":"1e7abbf7290011846f2420736a75def45163b58689e32ca69e75c8820223e8a7"} Dec 09 15:00:32 crc kubenswrapper[4555]: I1209 15:00:32.643965 4555 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 09 15:00:32 crc kubenswrapper[4555]: I1209 15:00:32.646104 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-wd6tn" 
event={"ID":"eee41022-2a9c-41c8-bbc8-ff42874c4050","Type":"ContainerStarted","Data":"0ee304509ba0fda875c576dca9bb9b28ecb72ef36c79d40ef8e10259157efcc8"} Dec 09 15:00:32 crc kubenswrapper[4555]: I1209 15:00:32.646157 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-wd6tn" event={"ID":"eee41022-2a9c-41c8-bbc8-ff42874c4050","Type":"ContainerStarted","Data":"7ed3ad04539c4041ff4a4e69a1a26dc71100e7c02fe83bef54acdffc8de5e5ae"} Dec 09 15:00:32 crc kubenswrapper[4555]: I1209 15:00:32.646240 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-wd6tn" Dec 09 15:00:32 crc kubenswrapper[4555]: I1209 15:00:32.648044 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kwts6" event={"ID":"b86a6402-17c6-4be0-b3ab-7d0bd357f8b2","Type":"ContainerStarted","Data":"29cf6f29221600b318f9a76c7097af53c14d428293bfd161b51a1a5a142423d8"} Dec 09 15:00:32 crc kubenswrapper[4555]: I1209 15:00:32.648079 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kwts6" event={"ID":"b86a6402-17c6-4be0-b3ab-7d0bd357f8b2","Type":"ContainerStarted","Data":"73186d0f13e83df9c0000055980b9bb60c0df511ec683aa8744cca68418526ae"} Dec 09 15:00:32 crc kubenswrapper[4555]: I1209 15:00:32.650579 4555 generic.go:334] "Generic (PLEG): container finished" podID="9ee87b32-b776-445a-b3f5-d94c87430742" containerID="90665757b4fb62635f77fbf1b82155622b338fb1dd704f86e9f132cc8b148804" exitCode=0 Dec 09 15:00:32 crc kubenswrapper[4555]: I1209 15:00:32.650649 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nnzzh" event={"ID":"9ee87b32-b776-445a-b3f5-d94c87430742","Type":"ContainerDied","Data":"90665757b4fb62635f77fbf1b82155622b338fb1dd704f86e9f132cc8b148804"} Dec 09 15:00:32 crc kubenswrapper[4555]: I1209 15:00:32.650681 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nnzzh" event={"ID":"9ee87b32-b776-445a-b3f5-d94c87430742","Type":"ContainerStarted","Data":"4255d4d1a34060f2ad4652af104e6a193491cabc20a1518c7a3fae1ce746d0a1"} Dec 09 15:00:32 crc kubenswrapper[4555]: I1209 15:00:32.652426 4555 generic.go:334] "Generic (PLEG): container finished" podID="14dca34a-64a8-4190-aa9a-07c5cb2bdf33" containerID="cf218e03cb5d7f91d68b277fb9b05d0ac2942940e11bbd8bc9073aa7c0ba1b29" exitCode=0 Dec 09 15:00:32 crc kubenswrapper[4555]: I1209 15:00:32.652505 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mc787" event={"ID":"14dca34a-64a8-4190-aa9a-07c5cb2bdf33","Type":"ContainerDied","Data":"cf218e03cb5d7f91d68b277fb9b05d0ac2942940e11bbd8bc9073aa7c0ba1b29"} Dec 09 15:00:32 crc kubenswrapper[4555]: I1209 15:00:32.652527 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mc787" event={"ID":"14dca34a-64a8-4190-aa9a-07c5cb2bdf33","Type":"ContainerStarted","Data":"d475c7287cf23068800595aa02c9ef28eb1d21373a0436cc3dcbbfac253d5807"} Dec 09 15:00:32 crc kubenswrapper[4555]: I1209 15:00:32.654702 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421540-hhxsf" event={"ID":"3936549f-6055-4b0d-af18-7958d91c672f","Type":"ContainerDied","Data":"1cda284bb770690536cc9f7a11c1c0e55229741608b23da1143bdbb130173e27"} Dec 09 15:00:32 crc kubenswrapper[4555]: I1209 15:00:32.654729 4555 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1cda284bb770690536cc9f7a11c1c0e55229741608b23da1143bdbb130173e27" Dec 09 15:00:32 crc kubenswrapper[4555]: I1209 15:00:32.655611 4555 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421540-hhxsf" Dec 09 15:00:32 crc kubenswrapper[4555]: I1209 15:00:32.715908 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-wd6tn" podStartSLOduration=120.715863831 podStartE2EDuration="2m0.715863831s" podCreationTimestamp="2025-12-09 14:58:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:00:32.71496003 +0000 UTC m=+142.851672748" watchObservedRunningTime="2025-12-09 15:00:32.715863831 +0000 UTC m=+142.852576539" Dec 09 15:00:33 crc kubenswrapper[4555]: I1209 15:00:33.212038 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 09 15:00:33 crc kubenswrapper[4555]: I1209 15:00:33.212689 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 09 15:00:33 crc kubenswrapper[4555]: I1209 15:00:33.218931 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 09 15:00:33 crc kubenswrapper[4555]: I1209 15:00:33.219578 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 09 15:00:33 crc kubenswrapper[4555]: I1209 15:00:33.224432 4555 patch_prober.go:28] interesting pod/router-default-5444994796-pldb7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 09 15:00:33 crc kubenswrapper[4555]: [-]has-synced failed: reason withheld Dec 09 15:00:33 crc kubenswrapper[4555]: [+]process-running ok Dec 09 15:00:33 crc kubenswrapper[4555]: healthz check failed Dec 09 15:00:33 crc kubenswrapper[4555]: I1209 15:00:33.224482 4555 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pldb7" podUID="33ad6890-1e95-4a46-9f40-00cf0443d4a1" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 09 15:00:33 crc kubenswrapper[4555]: I1209 15:00:33.226443 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 09 15:00:33 crc kubenswrapper[4555]: I1209 15:00:33.292144 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/04dd4e81-06ee-4e2c-aab2-8906307ef074-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"04dd4e81-06ee-4e2c-aab2-8906307ef074\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 09 15:00:33 crc kubenswrapper[4555]: I1209 15:00:33.292200 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/04dd4e81-06ee-4e2c-aab2-8906307ef074-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"04dd4e81-06ee-4e2c-aab2-8906307ef074\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 09 15:00:33 crc kubenswrapper[4555]: I1209 
15:00:33.393609 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/04dd4e81-06ee-4e2c-aab2-8906307ef074-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"04dd4e81-06ee-4e2c-aab2-8906307ef074\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 09 15:00:33 crc kubenswrapper[4555]: I1209 15:00:33.393718 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/04dd4e81-06ee-4e2c-aab2-8906307ef074-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"04dd4e81-06ee-4e2c-aab2-8906307ef074\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 09 15:00:33 crc kubenswrapper[4555]: I1209 15:00:33.393806 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/04dd4e81-06ee-4e2c-aab2-8906307ef074-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"04dd4e81-06ee-4e2c-aab2-8906307ef074\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 09 15:00:33 crc kubenswrapper[4555]: I1209 15:00:33.415954 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/04dd4e81-06ee-4e2c-aab2-8906307ef074-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"04dd4e81-06ee-4e2c-aab2-8906307ef074\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 09 15:00:33 crc kubenswrapper[4555]: I1209 15:00:33.524936 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 09 15:00:33 crc kubenswrapper[4555]: I1209 15:00:33.587451 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7d49r"] Dec 09 15:00:33 crc kubenswrapper[4555]: I1209 15:00:33.591069 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7d49r" Dec 09 15:00:33 crc kubenswrapper[4555]: I1209 15:00:33.592746 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 09 15:00:33 crc kubenswrapper[4555]: I1209 15:00:33.594296 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7d49r"] Dec 09 15:00:33 crc kubenswrapper[4555]: I1209 15:00:33.595798 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af79ef0a-fda8-4009-b915-297808ea4398-catalog-content\") pod \"redhat-marketplace-7d49r\" (UID: \"af79ef0a-fda8-4009-b915-297808ea4398\") " pod="openshift-marketplace/redhat-marketplace-7d49r" Dec 09 15:00:33 crc kubenswrapper[4555]: I1209 15:00:33.595865 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af79ef0a-fda8-4009-b915-297808ea4398-utilities\") pod \"redhat-marketplace-7d49r\" (UID: \"af79ef0a-fda8-4009-b915-297808ea4398\") " pod="openshift-marketplace/redhat-marketplace-7d49r" Dec 09 15:00:33 crc kubenswrapper[4555]: I1209 15:00:33.595888 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbvv6\" (UniqueName: \"kubernetes.io/projected/af79ef0a-fda8-4009-b915-297808ea4398-kube-api-access-vbvv6\") pod \"redhat-marketplace-7d49r\" (UID: \"af79ef0a-fda8-4009-b915-297808ea4398\") " pod="openshift-marketplace/redhat-marketplace-7d49r" Dec 09 15:00:33 crc kubenswrapper[4555]: I1209 15:00:33.666329 4555 generic.go:334] "Generic (PLEG): container finished" podID="b86a6402-17c6-4be0-b3ab-7d0bd357f8b2" containerID="29cf6f29221600b318f9a76c7097af53c14d428293bfd161b51a1a5a142423d8" exitCode=0 Dec 09 15:00:33 crc kubenswrapper[4555]: I1209 15:00:33.666408 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kwts6" event={"ID":"b86a6402-17c6-4be0-b3ab-7d0bd357f8b2","Type":"ContainerDied","Data":"29cf6f29221600b318f9a76c7097af53c14d428293bfd161b51a1a5a142423d8"} Dec 09 15:00:33 crc kubenswrapper[4555]: I1209 15:00:33.698028 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af79ef0a-fda8-4009-b915-297808ea4398-catalog-content\") pod \"redhat-marketplace-7d49r\" (UID: \"af79ef0a-fda8-4009-b915-297808ea4398\") " pod="openshift-marketplace/redhat-marketplace-7d49r" Dec 09 15:00:33 crc kubenswrapper[4555]: I1209 15:00:33.698281 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af79ef0a-fda8-4009-b915-297808ea4398-utilities\") pod \"redhat-marketplace-7d49r\" (UID: \"af79ef0a-fda8-4009-b915-297808ea4398\") " pod="openshift-marketplace/redhat-marketplace-7d49r" Dec 09 15:00:33 crc kubenswrapper[4555]: I1209 15:00:33.698365 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbvv6\" (UniqueName: \"kubernetes.io/projected/af79ef0a-fda8-4009-b915-297808ea4398-kube-api-access-vbvv6\") pod \"redhat-marketplace-7d49r\" (UID: \"af79ef0a-fda8-4009-b915-297808ea4398\") " pod="openshift-marketplace/redhat-marketplace-7d49r" Dec 09 15:00:33 crc kubenswrapper[4555]: I1209 15:00:33.698825 4555 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af79ef0a-fda8-4009-b915-297808ea4398-utilities\") pod \"redhat-marketplace-7d49r\" (UID: \"af79ef0a-fda8-4009-b915-297808ea4398\") " pod="openshift-marketplace/redhat-marketplace-7d49r" Dec 09 15:00:33 crc kubenswrapper[4555]: I1209 15:00:33.701013 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af79ef0a-fda8-4009-b915-297808ea4398-catalog-content\") pod \"redhat-marketplace-7d49r\" (UID: \"af79ef0a-fda8-4009-b915-297808ea4398\") " pod="openshift-marketplace/redhat-marketplace-7d49r" Dec 09 15:00:33 crc kubenswrapper[4555]: I1209 15:00:33.744875 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbvv6\" (UniqueName: \"kubernetes.io/projected/af79ef0a-fda8-4009-b915-297808ea4398-kube-api-access-vbvv6\") pod \"redhat-marketplace-7d49r\" (UID: \"af79ef0a-fda8-4009-b915-297808ea4398\") " pod="openshift-marketplace/redhat-marketplace-7d49r" Dec 09 15:00:33 crc kubenswrapper[4555]: I1209 15:00:33.747885 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 09 15:00:33 crc kubenswrapper[4555]: W1209 15:00:33.800724 4555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod04dd4e81_06ee_4e2c_aab2_8906307ef074.slice/crio-a0ff170dd5e72ef1bbda0c6f20e5ac49deb5142d9bb8179284b83cb19f9ba613 WatchSource:0}: Error finding container a0ff170dd5e72ef1bbda0c6f20e5ac49deb5142d9bb8179284b83cb19f9ba613: Status 404 returned error can't find the container with id a0ff170dd5e72ef1bbda0c6f20e5ac49deb5142d9bb8179284b83cb19f9ba613 Dec 09 15:00:33 crc kubenswrapper[4555]: I1209 15:00:33.909642 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7d49r" Dec 09 15:00:33 crc kubenswrapper[4555]: I1209 15:00:33.936663 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 09 15:00:33 crc kubenswrapper[4555]: I1209 15:00:33.937476 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 09 15:00:33 crc kubenswrapper[4555]: I1209 15:00:33.939290 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 09 15:00:33 crc kubenswrapper[4555]: I1209 15:00:33.939982 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Dec 09 15:00:33 crc kubenswrapper[4555]: I1209 15:00:33.940122 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 09 15:00:33 crc kubenswrapper[4555]: I1209 15:00:33.986439 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kqnwk"] Dec 09 15:00:33 crc kubenswrapper[4555]: I1209 15:00:33.990687 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kqnwk" Dec 09 15:00:34 crc kubenswrapper[4555]: I1209 15:00:34.001106 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kqnwk"] Dec 09 15:00:34 crc kubenswrapper[4555]: I1209 15:00:34.105881 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3f6bdc8-6545-4114-982c-f452951e5a0b-catalog-content\") pod \"redhat-marketplace-kqnwk\" (UID: \"d3f6bdc8-6545-4114-982c-f452951e5a0b\") " pod="openshift-marketplace/redhat-marketplace-kqnwk" Dec 09 15:00:34 crc kubenswrapper[4555]: I1209 15:00:34.106010 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkbrb\" (UniqueName: \"kubernetes.io/projected/d3f6bdc8-6545-4114-982c-f452951e5a0b-kube-api-access-gkbrb\") pod \"redhat-marketplace-kqnwk\" (UID: \"d3f6bdc8-6545-4114-982c-f452951e5a0b\") " pod="openshift-marketplace/redhat-marketplace-kqnwk" Dec 09 15:00:34 crc kubenswrapper[4555]: I1209 15:00:34.106057 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0e0577ca-03c5-4ede-8c4a-190bac74ed06-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"0e0577ca-03c5-4ede-8c4a-190bac74ed06\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 09 15:00:34 crc kubenswrapper[4555]: I1209 15:00:34.106104 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3f6bdc8-6545-4114-982c-f452951e5a0b-utilities\") pod \"redhat-marketplace-kqnwk\" (UID: \"d3f6bdc8-6545-4114-982c-f452951e5a0b\") " pod="openshift-marketplace/redhat-marketplace-kqnwk" Dec 09 15:00:34 crc kubenswrapper[4555]: I1209 15:00:34.106183 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0e0577ca-03c5-4ede-8c4a-190bac74ed06-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"0e0577ca-03c5-4ede-8c4a-190bac74ed06\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 09 15:00:34 crc kubenswrapper[4555]: I1209 15:00:34.207187 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3f6bdc8-6545-4114-982c-f452951e5a0b-utilities\") pod \"redhat-marketplace-kqnwk\" (UID: \"d3f6bdc8-6545-4114-982c-f452951e5a0b\") " pod="openshift-marketplace/redhat-marketplace-kqnwk" Dec 09 15:00:34 crc kubenswrapper[4555]: I1209 15:00:34.207267 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0e0577ca-03c5-4ede-8c4a-190bac74ed06-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"0e0577ca-03c5-4ede-8c4a-190bac74ed06\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 09 15:00:34 crc kubenswrapper[4555]: I1209 15:00:34.207296 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3f6bdc8-6545-4114-982c-f452951e5a0b-catalog-content\") pod \"redhat-marketplace-kqnwk\" (UID: \"d3f6bdc8-6545-4114-982c-f452951e5a0b\") " pod="openshift-marketplace/redhat-marketplace-kqnwk" Dec 09 15:00:34 crc kubenswrapper[4555]: I1209 
15:00:34.207404 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0e0577ca-03c5-4ede-8c4a-190bac74ed06-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"0e0577ca-03c5-4ede-8c4a-190bac74ed06\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 09 15:00:34 crc kubenswrapper[4555]: I1209 15:00:34.207427 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkbrb\" (UniqueName: \"kubernetes.io/projected/d3f6bdc8-6545-4114-982c-f452951e5a0b-kube-api-access-gkbrb\") pod \"redhat-marketplace-kqnwk\" (UID: \"d3f6bdc8-6545-4114-982c-f452951e5a0b\") " pod="openshift-marketplace/redhat-marketplace-kqnwk" Dec 09 15:00:34 crc kubenswrapper[4555]: I1209 15:00:34.207499 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0e0577ca-03c5-4ede-8c4a-190bac74ed06-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"0e0577ca-03c5-4ede-8c4a-190bac74ed06\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 09 15:00:34 crc kubenswrapper[4555]: I1209 15:00:34.208510 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3f6bdc8-6545-4114-982c-f452951e5a0b-catalog-content\") pod \"redhat-marketplace-kqnwk\" (UID: \"d3f6bdc8-6545-4114-982c-f452951e5a0b\") " pod="openshift-marketplace/redhat-marketplace-kqnwk" Dec 09 15:00:34 crc kubenswrapper[4555]: I1209 15:00:34.209629 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3f6bdc8-6545-4114-982c-f452951e5a0b-utilities\") pod \"redhat-marketplace-kqnwk\" (UID: \"d3f6bdc8-6545-4114-982c-f452951e5a0b\") " pod="openshift-marketplace/redhat-marketplace-kqnwk" Dec 09 15:00:34 crc kubenswrapper[4555]: I1209 15:00:34.227055 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkbrb\" (UniqueName: \"kubernetes.io/projected/d3f6bdc8-6545-4114-982c-f452951e5a0b-kube-api-access-gkbrb\") pod \"redhat-marketplace-kqnwk\" (UID: \"d3f6bdc8-6545-4114-982c-f452951e5a0b\") " pod="openshift-marketplace/redhat-marketplace-kqnwk" Dec 09 15:00:34 crc kubenswrapper[4555]: I1209 15:00:34.227558 4555 patch_prober.go:28] interesting pod/router-default-5444994796-pldb7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 09 15:00:34 crc kubenswrapper[4555]: [-]has-synced failed: reason withheld Dec 09 15:00:34 crc kubenswrapper[4555]: [+]process-running ok Dec 09 15:00:34 crc kubenswrapper[4555]: healthz check failed Dec 09 15:00:34 crc kubenswrapper[4555]: I1209 15:00:34.228642 4555 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pldb7" podUID="33ad6890-1e95-4a46-9f40-00cf0443d4a1" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 09 15:00:34 crc kubenswrapper[4555]: I1209 15:00:34.229934 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0e0577ca-03c5-4ede-8c4a-190bac74ed06-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"0e0577ca-03c5-4ede-8c4a-190bac74ed06\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 09 15:00:34 crc 
kubenswrapper[4555]: I1209 15:00:34.252682 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 09 15:00:34 crc kubenswrapper[4555]: I1209 15:00:34.294932 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7d49r"] Dec 09 15:00:34 crc kubenswrapper[4555]: I1209 15:00:34.306394 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kqnwk" Dec 09 15:00:34 crc kubenswrapper[4555]: W1209 15:00:34.313180 4555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf79ef0a_fda8_4009_b915_297808ea4398.slice/crio-276a29ca5488592d7ef5b8e9413ef24bc37c27518ad68dd09944711293c88c88 WatchSource:0}: Error finding container 276a29ca5488592d7ef5b8e9413ef24bc37c27518ad68dd09944711293c88c88: Status 404 returned error can't find the container with id 276a29ca5488592d7ef5b8e9413ef24bc37c27518ad68dd09944711293c88c88 Dec 09 15:00:34 crc kubenswrapper[4555]: I1209 15:00:34.541043 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 09 15:00:34 crc kubenswrapper[4555]: I1209 15:00:34.591397 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kqnwk"] Dec 09 15:00:34 crc kubenswrapper[4555]: I1209 15:00:34.591440 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dwd6f"] Dec 09 15:00:34 crc kubenswrapper[4555]: I1209 15:00:34.592885 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dwd6f" Dec 09 15:00:34 crc kubenswrapper[4555]: I1209 15:00:34.598043 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 09 15:00:34 crc kubenswrapper[4555]: I1209 15:00:34.598858 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dwd6f"] Dec 09 15:00:34 crc kubenswrapper[4555]: I1209 15:00:34.676246 4555 generic.go:334] "Generic (PLEG): container finished" podID="af79ef0a-fda8-4009-b915-297808ea4398" containerID="5117c692e72c56ee79972900b8daff62f2eec8c0ce9cb60398e10ca0c4581d24" exitCode=0 Dec 09 15:00:34 crc kubenswrapper[4555]: I1209 15:00:34.676318 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7d49r" event={"ID":"af79ef0a-fda8-4009-b915-297808ea4398","Type":"ContainerDied","Data":"5117c692e72c56ee79972900b8daff62f2eec8c0ce9cb60398e10ca0c4581d24"} Dec 09 15:00:34 crc kubenswrapper[4555]: I1209 15:00:34.676349 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7d49r" event={"ID":"af79ef0a-fda8-4009-b915-297808ea4398","Type":"ContainerStarted","Data":"276a29ca5488592d7ef5b8e9413ef24bc37c27518ad68dd09944711293c88c88"} Dec 09 15:00:34 crc kubenswrapper[4555]: I1209 15:00:34.681982 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"04dd4e81-06ee-4e2c-aab2-8906307ef074","Type":"ContainerStarted","Data":"f203cd4bbda54715e4d6afe5e28819bfaf714921ec19a2d93172b9be05799953"} Dec 09 15:00:34 crc kubenswrapper[4555]: I1209 15:00:34.682051 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" 
event={"ID":"04dd4e81-06ee-4e2c-aab2-8906307ef074","Type":"ContainerStarted","Data":"a0ff170dd5e72ef1bbda0c6f20e5ac49deb5142d9bb8179284b83cb19f9ba613"} Dec 09 15:00:34 crc kubenswrapper[4555]: I1209 15:00:34.686125 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"0e0577ca-03c5-4ede-8c4a-190bac74ed06","Type":"ContainerStarted","Data":"6790f02d8e594b4060cfa459fa7cf91425ace23c292102a58b675caf42fd636a"} Dec 09 15:00:34 crc kubenswrapper[4555]: I1209 15:00:34.687418 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kqnwk" event={"ID":"d3f6bdc8-6545-4114-982c-f452951e5a0b","Type":"ContainerStarted","Data":"322aab1fca241fedf8c3bf6f45d018cdb762a33f0a63eac547bdc66c15c0ce8a"} Dec 09 15:00:34 crc kubenswrapper[4555]: I1209 15:00:34.721648 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b3ae0d5-62f7-4219-b006-f6a33d913018-utilities\") pod \"redhat-operators-dwd6f\" (UID: \"5b3ae0d5-62f7-4219-b006-f6a33d913018\") " pod="openshift-marketplace/redhat-operators-dwd6f" Dec 09 15:00:34 crc kubenswrapper[4555]: I1209 15:00:34.721744 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rj6wx\" (UniqueName: \"kubernetes.io/projected/5b3ae0d5-62f7-4219-b006-f6a33d913018-kube-api-access-rj6wx\") pod \"redhat-operators-dwd6f\" (UID: \"5b3ae0d5-62f7-4219-b006-f6a33d913018\") " pod="openshift-marketplace/redhat-operators-dwd6f" Dec 09 15:00:34 crc kubenswrapper[4555]: I1209 15:00:34.721791 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b3ae0d5-62f7-4219-b006-f6a33d913018-catalog-content\") pod \"redhat-operators-dwd6f\" (UID: \"5b3ae0d5-62f7-4219-b006-f6a33d913018\") " pod="openshift-marketplace/redhat-operators-dwd6f" Dec 09 15:00:34 crc kubenswrapper[4555]: I1209 15:00:34.811151 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mtdft" Dec 09 15:00:34 crc kubenswrapper[4555]: I1209 15:00:34.823555 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b3ae0d5-62f7-4219-b006-f6a33d913018-catalog-content\") pod \"redhat-operators-dwd6f\" (UID: \"5b3ae0d5-62f7-4219-b006-f6a33d913018\") " pod="openshift-marketplace/redhat-operators-dwd6f" Dec 09 15:00:34 crc kubenswrapper[4555]: I1209 15:00:34.823699 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b3ae0d5-62f7-4219-b006-f6a33d913018-utilities\") pod \"redhat-operators-dwd6f\" (UID: \"5b3ae0d5-62f7-4219-b006-f6a33d913018\") " pod="openshift-marketplace/redhat-operators-dwd6f" Dec 09 15:00:34 crc kubenswrapper[4555]: I1209 15:00:34.823730 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rj6wx\" (UniqueName: \"kubernetes.io/projected/5b3ae0d5-62f7-4219-b006-f6a33d913018-kube-api-access-rj6wx\") pod \"redhat-operators-dwd6f\" (UID: \"5b3ae0d5-62f7-4219-b006-f6a33d913018\") " pod="openshift-marketplace/redhat-operators-dwd6f" Dec 09 15:00:34 crc kubenswrapper[4555]: I1209 15:00:34.824251 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b3ae0d5-62f7-4219-b006-f6a33d913018-catalog-content\") pod \"redhat-operators-dwd6f\" (UID: \"5b3ae0d5-62f7-4219-b006-f6a33d913018\") " pod="openshift-marketplace/redhat-operators-dwd6f" Dec 09 15:00:34 crc kubenswrapper[4555]: I1209 15:00:34.826046 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b3ae0d5-62f7-4219-b006-f6a33d913018-utilities\") pod \"redhat-operators-dwd6f\" (UID: \"5b3ae0d5-62f7-4219-b006-f6a33d913018\") " pod="openshift-marketplace/redhat-operators-dwd6f" Dec 09 15:00:34 crc kubenswrapper[4555]: I1209 15:00:34.828170 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=1.8281462510000002 podStartE2EDuration="1.828146251s" podCreationTimestamp="2025-12-09 15:00:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:00:34.709198414 +0000 UTC m=+144.845911131" watchObservedRunningTime="2025-12-09 15:00:34.828146251 +0000 UTC m=+144.964858970" Dec 09 15:00:34 crc kubenswrapper[4555]: I1209 15:00:34.842351 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rj6wx\" (UniqueName: \"kubernetes.io/projected/5b3ae0d5-62f7-4219-b006-f6a33d913018-kube-api-access-rj6wx\") pod \"redhat-operators-dwd6f\" (UID: \"5b3ae0d5-62f7-4219-b006-f6a33d913018\") " pod="openshift-marketplace/redhat-operators-dwd6f" Dec 09 15:00:34 crc kubenswrapper[4555]: I1209 15:00:34.947663 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dwd6f" Dec 09 15:00:34 crc kubenswrapper[4555]: I1209 15:00:34.988722 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vpsml"] Dec 09 15:00:34 crc kubenswrapper[4555]: I1209 15:00:34.990035 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vpsml" Dec 09 15:00:34 crc kubenswrapper[4555]: I1209 15:00:34.993676 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vpsml"] Dec 09 15:00:35 crc kubenswrapper[4555]: I1209 15:00:35.130257 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qv5t\" (UniqueName: \"kubernetes.io/projected/99932cc2-84e4-4996-84ab-55407f09fa3a-kube-api-access-6qv5t\") pod \"redhat-operators-vpsml\" (UID: \"99932cc2-84e4-4996-84ab-55407f09fa3a\") " pod="openshift-marketplace/redhat-operators-vpsml" Dec 09 15:00:35 crc kubenswrapper[4555]: I1209 15:00:35.130422 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 15:00:35 crc kubenswrapper[4555]: I1209 15:00:35.130518 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99932cc2-84e4-4996-84ab-55407f09fa3a-utilities\") pod \"redhat-operators-vpsml\" (UID: \"99932cc2-84e4-4996-84ab-55407f09fa3a\") " pod="openshift-marketplace/redhat-operators-vpsml" Dec 09 15:00:35 crc kubenswrapper[4555]: I1209 15:00:35.130536 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99932cc2-84e4-4996-84ab-55407f09fa3a-catalog-content\") pod \"redhat-operators-vpsml\" (UID: \"99932cc2-84e4-4996-84ab-55407f09fa3a\") " pod="openshift-marketplace/redhat-operators-vpsml" Dec 09 15:00:35 crc kubenswrapper[4555]: I1209 15:00:35.130641 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 15:00:35 crc kubenswrapper[4555]: I1209 15:00:35.131420 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 15:00:35 crc kubenswrapper[4555]: I1209 15:00:35.135280 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 15:00:35 crc kubenswrapper[4555]: I1209 15:00:35.166504 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-vbr87" Dec 09 15:00:35 crc kubenswrapper[4555]: I1209 15:00:35.225453 4555 patch_prober.go:28] interesting 
pod/router-default-5444994796-pldb7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 09 15:00:35 crc kubenswrapper[4555]: [-]has-synced failed: reason withheld Dec 09 15:00:35 crc kubenswrapper[4555]: [+]process-running ok Dec 09 15:00:35 crc kubenswrapper[4555]: healthz check failed Dec 09 15:00:35 crc kubenswrapper[4555]: I1209 15:00:35.225521 4555 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pldb7" podUID="33ad6890-1e95-4a46-9f40-00cf0443d4a1" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 09 15:00:35 crc kubenswrapper[4555]: I1209 15:00:35.232099 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 15:00:35 crc kubenswrapper[4555]: I1209 15:00:35.232133 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qv5t\" (UniqueName: \"kubernetes.io/projected/99932cc2-84e4-4996-84ab-55407f09fa3a-kube-api-access-6qv5t\") pod \"redhat-operators-vpsml\" (UID: \"99932cc2-84e4-4996-84ab-55407f09fa3a\") " pod="openshift-marketplace/redhat-operators-vpsml" Dec 09 15:00:35 crc kubenswrapper[4555]: I1209 15:00:35.232199 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99932cc2-84e4-4996-84ab-55407f09fa3a-utilities\") pod \"redhat-operators-vpsml\" (UID: \"99932cc2-84e4-4996-84ab-55407f09fa3a\") " pod="openshift-marketplace/redhat-operators-vpsml" Dec 09 15:00:35 crc kubenswrapper[4555]: I1209 15:00:35.232216 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99932cc2-84e4-4996-84ab-55407f09fa3a-catalog-content\") pod \"redhat-operators-vpsml\" (UID: \"99932cc2-84e4-4996-84ab-55407f09fa3a\") " pod="openshift-marketplace/redhat-operators-vpsml" Dec 09 15:00:35 crc kubenswrapper[4555]: I1209 15:00:35.232259 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 15:00:35 crc kubenswrapper[4555]: I1209 15:00:35.233327 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99932cc2-84e4-4996-84ab-55407f09fa3a-catalog-content\") pod \"redhat-operators-vpsml\" (UID: \"99932cc2-84e4-4996-84ab-55407f09fa3a\") " pod="openshift-marketplace/redhat-operators-vpsml" Dec 09 15:00:35 crc kubenswrapper[4555]: I1209 15:00:35.233407 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99932cc2-84e4-4996-84ab-55407f09fa3a-utilities\") pod \"redhat-operators-vpsml\" (UID: \"99932cc2-84e4-4996-84ab-55407f09fa3a\") " pod="openshift-marketplace/redhat-operators-vpsml" Dec 09 15:00:35 crc 
kubenswrapper[4555]: I1209 15:00:35.235411 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 15:00:35 crc kubenswrapper[4555]: I1209 15:00:35.235920 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 15:00:35 crc kubenswrapper[4555]: I1209 15:00:35.245852 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qv5t\" (UniqueName: \"kubernetes.io/projected/99932cc2-84e4-4996-84ab-55407f09fa3a-kube-api-access-6qv5t\") pod \"redhat-operators-vpsml\" (UID: \"99932cc2-84e4-4996-84ab-55407f09fa3a\") " pod="openshift-marketplace/redhat-operators-vpsml" Dec 09 15:00:35 crc kubenswrapper[4555]: I1209 15:00:35.271665 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 15:00:35 crc kubenswrapper[4555]: I1209 15:00:35.276890 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 15:00:35 crc kubenswrapper[4555]: I1209 15:00:35.281070 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 15:00:35 crc kubenswrapper[4555]: I1209 15:00:35.309093 4555 util.go:30] "No sandbox for pod can be found. 
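The once-a-second patch_prober/prober entries running through this stretch of the log are the kubelet's Startup probe for openshift-ingress/router-default-5444994796-pldb7 getting HTTP 500 back from the router's health endpoint ([-]backend-http and [-]has-synced still failing) until the probe reports started at 15:00:37 below. The router deployment's actual probe definition is not part of this log, so the sketch below is only an illustration of the kind of startupProbe the kubelet is executing, expressed with the k8s.io/api types; the path, port, period and failure threshold are assumptions, not values read from the log.

package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
	"k8s.io/apimachinery/pkg/util/intstr"
)

func main() {
	startup := corev1.Probe{
		ProbeHandler: corev1.ProbeHandler{
			HTTPGet: &corev1.HTTPGetAction{
				Path:   "/healthz",           // assumed; the log only shows "healthz check failed" in the response body
				Port:   intstr.FromInt(1936), // assumed port, not taken from the log
				Scheme: corev1.URISchemeHTTP,
			},
		},
		PeriodSeconds:    1,   // assumed; the failures above land roughly once per second
		FailureThreshold: 120, // assumed
	}
	fmt.Printf("startupProbe sketch: %+v\n", startup)
}
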
Need to start a new one" pod="openshift-marketplace/redhat-operators-vpsml" Dec 09 15:00:35 crc kubenswrapper[4555]: I1209 15:00:35.382876 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dwd6f"] Dec 09 15:00:35 crc kubenswrapper[4555]: W1209 15:00:35.436970 4555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b3ae0d5_62f7_4219_b006_f6a33d913018.slice/crio-f87e573a8339f21fd33bcff79510765cbd0fbea1c20eba86027f37f21ad75ea8 WatchSource:0}: Error finding container f87e573a8339f21fd33bcff79510765cbd0fbea1c20eba86027f37f21ad75ea8: Status 404 returned error can't find the container with id f87e573a8339f21fd33bcff79510765cbd0fbea1c20eba86027f37f21ad75ea8 Dec 09 15:00:35 crc kubenswrapper[4555]: I1209 15:00:35.510595 4555 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-vhncm" Dec 09 15:00:35 crc kubenswrapper[4555]: I1209 15:00:35.510656 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-vhncm" Dec 09 15:00:35 crc kubenswrapper[4555]: I1209 15:00:35.513464 4555 patch_prober.go:28] interesting pod/console-f9d7485db-vhncm container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.21:8443/health\": dial tcp 10.217.0.21:8443: connect: connection refused" start-of-body= Dec 09 15:00:35 crc kubenswrapper[4555]: I1209 15:00:35.513521 4555 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-vhncm" podUID="a7fc2be6-0cb9-4fad-ac58-37004264a563" containerName="console" probeResult="failure" output="Get \"https://10.217.0.21:8443/health\": dial tcp 10.217.0.21:8443: connect: connection refused" Dec 09 15:00:35 crc kubenswrapper[4555]: I1209 15:00:35.696543 4555 generic.go:334] "Generic (PLEG): container finished" podID="5b3ae0d5-62f7-4219-b006-f6a33d913018" containerID="0890db3247a8baaba0f4d0ff9a3f2b9b777f2a6f361280e6fb2e22cee4c464a5" exitCode=0 Dec 09 15:00:35 crc kubenswrapper[4555]: I1209 15:00:35.696620 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dwd6f" event={"ID":"5b3ae0d5-62f7-4219-b006-f6a33d913018","Type":"ContainerDied","Data":"0890db3247a8baaba0f4d0ff9a3f2b9b777f2a6f361280e6fb2e22cee4c464a5"} Dec 09 15:00:35 crc kubenswrapper[4555]: I1209 15:00:35.696651 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dwd6f" event={"ID":"5b3ae0d5-62f7-4219-b006-f6a33d913018","Type":"ContainerStarted","Data":"f87e573a8339f21fd33bcff79510765cbd0fbea1c20eba86027f37f21ad75ea8"} Dec 09 15:00:35 crc kubenswrapper[4555]: I1209 15:00:35.707120 4555 generic.go:334] "Generic (PLEG): container finished" podID="04dd4e81-06ee-4e2c-aab2-8906307ef074" containerID="f203cd4bbda54715e4d6afe5e28819bfaf714921ec19a2d93172b9be05799953" exitCode=0 Dec 09 15:00:35 crc kubenswrapper[4555]: I1209 15:00:35.707203 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"04dd4e81-06ee-4e2c-aab2-8906307ef074","Type":"ContainerDied","Data":"f203cd4bbda54715e4d6afe5e28819bfaf714921ec19a2d93172b9be05799953"} Dec 09 15:00:35 crc kubenswrapper[4555]: I1209 15:00:35.709430 4555 generic.go:334] "Generic (PLEG): container finished" podID="0e0577ca-03c5-4ede-8c4a-190bac74ed06" containerID="4c0a71bea116876fa493cd618965913fc14188826a53c2b4ff74407d123d5588" 
exitCode=0 Dec 09 15:00:35 crc kubenswrapper[4555]: I1209 15:00:35.709504 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"0e0577ca-03c5-4ede-8c4a-190bac74ed06","Type":"ContainerDied","Data":"4c0a71bea116876fa493cd618965913fc14188826a53c2b4ff74407d123d5588"} Dec 09 15:00:35 crc kubenswrapper[4555]: I1209 15:00:35.715746 4555 generic.go:334] "Generic (PLEG): container finished" podID="d3f6bdc8-6545-4114-982c-f452951e5a0b" containerID="7ba931d6701dd687698d85c9c68091f05fe060614907d61978a91d24d3047ed8" exitCode=0 Dec 09 15:00:35 crc kubenswrapper[4555]: I1209 15:00:35.715778 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kqnwk" event={"ID":"d3f6bdc8-6545-4114-982c-f452951e5a0b","Type":"ContainerDied","Data":"7ba931d6701dd687698d85c9c68091f05fe060614907d61978a91d24d3047ed8"} Dec 09 15:00:35 crc kubenswrapper[4555]: I1209 15:00:35.763674 4555 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-xdwlv" Dec 09 15:00:35 crc kubenswrapper[4555]: I1209 15:00:35.770435 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-xdwlv" Dec 09 15:00:35 crc kubenswrapper[4555]: I1209 15:00:35.811402 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vpsml"] Dec 09 15:00:35 crc kubenswrapper[4555]: W1209 15:00:35.927990 4555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99932cc2_84e4_4996_84ab_55407f09fa3a.slice/crio-0be65d01bb16c4e697eaf84fc81f3507d7ac12fbd86cf3aadbc9d1558a56d655 WatchSource:0}: Error finding container 0be65d01bb16c4e697eaf84fc81f3507d7ac12fbd86cf3aadbc9d1558a56d655: Status 404 returned error can't find the container with id 0be65d01bb16c4e697eaf84fc81f3507d7ac12fbd86cf3aadbc9d1558a56d655 Dec 09 15:00:36 crc kubenswrapper[4555]: I1209 15:00:36.221612 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-pldb7" Dec 09 15:00:36 crc kubenswrapper[4555]: I1209 15:00:36.224332 4555 patch_prober.go:28] interesting pod/router-default-5444994796-pldb7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 09 15:00:36 crc kubenswrapper[4555]: [-]has-synced failed: reason withheld Dec 09 15:00:36 crc kubenswrapper[4555]: [+]process-running ok Dec 09 15:00:36 crc kubenswrapper[4555]: healthz check failed Dec 09 15:00:36 crc kubenswrapper[4555]: I1209 15:00:36.224370 4555 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pldb7" podUID="33ad6890-1e95-4a46-9f40-00cf0443d4a1" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 09 15:00:36 crc kubenswrapper[4555]: I1209 15:00:36.724055 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"b606f1265ce1e41c81e2f890f36e116e94a1587d04d0f3fe3de4d34c6a087454"} Dec 09 15:00:36 crc kubenswrapper[4555]: I1209 15:00:36.724106 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"b7495aec21583637e571882674106d548eb6c79a2777e4f61dbdc93e6c2a1278"} Dec 09 15:00:36 crc kubenswrapper[4555]: I1209 15:00:36.727349 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"0e18dd862867a0617a5048cb93bdb9bed8106c2fea5d5ebc758401fca8ca10cb"} Dec 09 15:00:36 crc kubenswrapper[4555]: I1209 15:00:36.727429 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"4b12f78e66e59c447b9fbbd2c2c65739eb8ec558c5b19833dd8ebcf8b2627836"} Dec 09 15:00:36 crc kubenswrapper[4555]: I1209 15:00:36.727641 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 15:00:36 crc kubenswrapper[4555]: I1209 15:00:36.733678 4555 generic.go:334] "Generic (PLEG): container finished" podID="99932cc2-84e4-4996-84ab-55407f09fa3a" containerID="88fa7e5250a60cc294d46717a601f6896e0bd90c9d9ad4ae09c96a77d53d0ad4" exitCode=0 Dec 09 15:00:36 crc kubenswrapper[4555]: I1209 15:00:36.733837 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vpsml" event={"ID":"99932cc2-84e4-4996-84ab-55407f09fa3a","Type":"ContainerDied","Data":"88fa7e5250a60cc294d46717a601f6896e0bd90c9d9ad4ae09c96a77d53d0ad4"} Dec 09 15:00:36 crc kubenswrapper[4555]: I1209 15:00:36.733882 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vpsml" event={"ID":"99932cc2-84e4-4996-84ab-55407f09fa3a","Type":"ContainerStarted","Data":"0be65d01bb16c4e697eaf84fc81f3507d7ac12fbd86cf3aadbc9d1558a56d655"} Dec 09 15:00:36 crc kubenswrapper[4555]: I1209 15:00:36.741202 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"9a1122e668ab2a5bef476301dfa586a912501d615ec8ffcfb011b864fe9697fe"} Dec 09 15:00:36 crc kubenswrapper[4555]: I1209 15:00:36.741255 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"e267f5400dbfc991e8dc1ccf9c1161fd6ee8d48c6fdc139d27fdd054c71df14a"} Dec 09 15:00:37 crc kubenswrapper[4555]: I1209 15:00:37.232923 4555 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-pldb7" Dec 09 15:00:37 crc kubenswrapper[4555]: I1209 15:00:37.238820 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-pldb7" Dec 09 15:00:39 crc kubenswrapper[4555]: I1209 15:00:39.468061 4555 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 09 15:00:39 crc kubenswrapper[4555]: I1209 15:00:39.610328 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0e0577ca-03c5-4ede-8c4a-190bac74ed06-kubelet-dir\") pod \"0e0577ca-03c5-4ede-8c4a-190bac74ed06\" (UID: \"0e0577ca-03c5-4ede-8c4a-190bac74ed06\") " Dec 09 15:00:39 crc kubenswrapper[4555]: I1209 15:00:39.610380 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0e0577ca-03c5-4ede-8c4a-190bac74ed06-kube-api-access\") pod \"0e0577ca-03c5-4ede-8c4a-190bac74ed06\" (UID: \"0e0577ca-03c5-4ede-8c4a-190bac74ed06\") " Dec 09 15:00:39 crc kubenswrapper[4555]: I1209 15:00:39.610625 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0e0577ca-03c5-4ede-8c4a-190bac74ed06-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "0e0577ca-03c5-4ede-8c4a-190bac74ed06" (UID: "0e0577ca-03c5-4ede-8c4a-190bac74ed06"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 15:00:39 crc kubenswrapper[4555]: I1209 15:00:39.615474 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e0577ca-03c5-4ede-8c4a-190bac74ed06-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0e0577ca-03c5-4ede-8c4a-190bac74ed06" (UID: "0e0577ca-03c5-4ede-8c4a-190bac74ed06"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:00:39 crc kubenswrapper[4555]: I1209 15:00:39.711763 4555 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0e0577ca-03c5-4ede-8c4a-190bac74ed06-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 09 15:00:39 crc kubenswrapper[4555]: I1209 15:00:39.711786 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0e0577ca-03c5-4ede-8c4a-190bac74ed06-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 09 15:00:39 crc kubenswrapper[4555]: I1209 15:00:39.772877 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"0e0577ca-03c5-4ede-8c4a-190bac74ed06","Type":"ContainerDied","Data":"6790f02d8e594b4060cfa459fa7cf91425ace23c292102a58b675caf42fd636a"} Dec 09 15:00:39 crc kubenswrapper[4555]: I1209 15:00:39.772941 4555 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6790f02d8e594b4060cfa459fa7cf91425ace23c292102a58b675caf42fd636a" Dec 09 15:00:39 crc kubenswrapper[4555]: I1209 15:00:39.773049 4555 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 09 15:00:39 crc kubenswrapper[4555]: I1209 15:00:39.942658 4555 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 09 15:00:40 crc kubenswrapper[4555]: I1209 15:00:40.116426 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/04dd4e81-06ee-4e2c-aab2-8906307ef074-kube-api-access\") pod \"04dd4e81-06ee-4e2c-aab2-8906307ef074\" (UID: \"04dd4e81-06ee-4e2c-aab2-8906307ef074\") " Dec 09 15:00:40 crc kubenswrapper[4555]: I1209 15:00:40.116475 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/04dd4e81-06ee-4e2c-aab2-8906307ef074-kubelet-dir\") pod \"04dd4e81-06ee-4e2c-aab2-8906307ef074\" (UID: \"04dd4e81-06ee-4e2c-aab2-8906307ef074\") " Dec 09 15:00:40 crc kubenswrapper[4555]: I1209 15:00:40.116833 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/04dd4e81-06ee-4e2c-aab2-8906307ef074-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "04dd4e81-06ee-4e2c-aab2-8906307ef074" (UID: "04dd4e81-06ee-4e2c-aab2-8906307ef074"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 15:00:40 crc kubenswrapper[4555]: I1209 15:00:40.119597 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04dd4e81-06ee-4e2c-aab2-8906307ef074-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "04dd4e81-06ee-4e2c-aab2-8906307ef074" (UID: "04dd4e81-06ee-4e2c-aab2-8906307ef074"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:00:40 crc kubenswrapper[4555]: I1209 15:00:40.218844 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/04dd4e81-06ee-4e2c-aab2-8906307ef074-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 09 15:00:40 crc kubenswrapper[4555]: I1209 15:00:40.218895 4555 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/04dd4e81-06ee-4e2c-aab2-8906307ef074-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 09 15:00:40 crc kubenswrapper[4555]: I1209 15:00:40.784820 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"04dd4e81-06ee-4e2c-aab2-8906307ef074","Type":"ContainerDied","Data":"a0ff170dd5e72ef1bbda0c6f20e5ac49deb5142d9bb8179284b83cb19f9ba613"} Dec 09 15:00:40 crc kubenswrapper[4555]: I1209 15:00:40.785061 4555 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0ff170dd5e72ef1bbda0c6f20e5ac49deb5142d9bb8179284b83cb19f9ba613" Dec 09 15:00:40 crc kubenswrapper[4555]: I1209 15:00:40.784858 4555 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 09 15:00:41 crc kubenswrapper[4555]: I1209 15:00:41.307960 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-9l7sb" Dec 09 15:00:45 crc kubenswrapper[4555]: I1209 15:00:45.514603 4555 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-vhncm" Dec 09 15:00:45 crc kubenswrapper[4555]: I1209 15:00:45.518131 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-vhncm" Dec 09 15:00:48 crc kubenswrapper[4555]: I1209 15:00:48.824915 4555 generic.go:334] "Generic (PLEG): container finished" podID="14dca34a-64a8-4190-aa9a-07c5cb2bdf33" containerID="bbd7dd5281f9d575736eacd577f826add0efd6e39abe6e8f4740fc58244951d4" exitCode=0 Dec 09 15:00:48 crc kubenswrapper[4555]: I1209 15:00:48.825000 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mc787" event={"ID":"14dca34a-64a8-4190-aa9a-07c5cb2bdf33","Type":"ContainerDied","Data":"bbd7dd5281f9d575736eacd577f826add0efd6e39abe6e8f4740fc58244951d4"} Dec 09 15:00:48 crc kubenswrapper[4555]: I1209 15:00:48.833816 4555 generic.go:334] "Generic (PLEG): container finished" podID="af79ef0a-fda8-4009-b915-297808ea4398" containerID="88b8aa6eaa59af9e33fdc13805932789b6f19cb1de086b3f84c70abf6859e5d9" exitCode=0 Dec 09 15:00:48 crc kubenswrapper[4555]: I1209 15:00:48.833913 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7d49r" event={"ID":"af79ef0a-fda8-4009-b915-297808ea4398","Type":"ContainerDied","Data":"88b8aa6eaa59af9e33fdc13805932789b6f19cb1de086b3f84c70abf6859e5d9"} Dec 09 15:00:48 crc kubenswrapper[4555]: I1209 15:00:48.835919 4555 generic.go:334] "Generic (PLEG): container finished" podID="9ee87b32-b776-445a-b3f5-d94c87430742" containerID="bf6111fac6cca6f155f479b2c907b1c7b1defd4847fca4fb74b7b04ff01aa821" exitCode=0 Dec 09 15:00:48 crc kubenswrapper[4555]: I1209 15:00:48.835947 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nnzzh" event={"ID":"9ee87b32-b776-445a-b3f5-d94c87430742","Type":"ContainerDied","Data":"bf6111fac6cca6f155f479b2c907b1c7b1defd4847fca4fb74b7b04ff01aa821"} Dec 09 15:00:49 crc kubenswrapper[4555]: I1209 15:00:49.842169 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7d49r" event={"ID":"af79ef0a-fda8-4009-b915-297808ea4398","Type":"ContainerStarted","Data":"15719dafad2ef5fef86c9a7f2f4402239d8a2237de04f545d7663b71c380cc85"} Dec 09 15:00:49 crc kubenswrapper[4555]: I1209 15:00:49.845146 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nnzzh" event={"ID":"9ee87b32-b776-445a-b3f5-d94c87430742","Type":"ContainerStarted","Data":"173e2bac19f98e044889ad8bf0a8574a7f4395a10a08eca65ef4e48092867dd6"} Dec 09 15:00:49 crc kubenswrapper[4555]: I1209 15:00:49.848142 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mc787" event={"ID":"14dca34a-64a8-4190-aa9a-07c5cb2bdf33","Type":"ContainerStarted","Data":"f654a0b4c4ba6add363d7b89f0b30e83beacdd62392ff79cd31849ab5e4f0de7"} Dec 09 15:00:49 crc kubenswrapper[4555]: I1209 15:00:49.857983 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7d49r" podStartSLOduration=2.08278429 
podStartE2EDuration="16.857971543s" podCreationTimestamp="2025-12-09 15:00:33 +0000 UTC" firstStartedPulling="2025-12-09 15:00:34.688834492 +0000 UTC m=+144.825547210" lastFinishedPulling="2025-12-09 15:00:49.464021745 +0000 UTC m=+159.600734463" observedRunningTime="2025-12-09 15:00:49.856405317 +0000 UTC m=+159.993118034" watchObservedRunningTime="2025-12-09 15:00:49.857971543 +0000 UTC m=+159.994684261" Dec 09 15:00:49 crc kubenswrapper[4555]: I1209 15:00:49.873736 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nnzzh" podStartSLOduration=2.0999617170000002 podStartE2EDuration="18.873721624s" podCreationTimestamp="2025-12-09 15:00:31 +0000 UTC" firstStartedPulling="2025-12-09 15:00:32.651680723 +0000 UTC m=+142.788393440" lastFinishedPulling="2025-12-09 15:00:49.425440629 +0000 UTC m=+159.562153347" observedRunningTime="2025-12-09 15:00:49.870849836 +0000 UTC m=+160.007562553" watchObservedRunningTime="2025-12-09 15:00:49.873721624 +0000 UTC m=+160.010434342" Dec 09 15:00:49 crc kubenswrapper[4555]: I1209 15:00:49.884228 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mc787" podStartSLOduration=2.119898741 podStartE2EDuration="18.884206785s" podCreationTimestamp="2025-12-09 15:00:31 +0000 UTC" firstStartedPulling="2025-12-09 15:00:32.65353261 +0000 UTC m=+142.790245328" lastFinishedPulling="2025-12-09 15:00:49.417840653 +0000 UTC m=+159.554553372" observedRunningTime="2025-12-09 15:00:49.883424052 +0000 UTC m=+160.020136770" watchObservedRunningTime="2025-12-09 15:00:49.884206785 +0000 UTC m=+160.020919503" Dec 09 15:00:51 crc kubenswrapper[4555]: I1209 15:00:51.497844 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-wd6tn" Dec 09 15:00:51 crc kubenswrapper[4555]: I1209 15:00:51.919657 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mc787" Dec 09 15:00:51 crc kubenswrapper[4555]: I1209 15:00:51.919914 4555 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mc787" Dec 09 15:00:51 crc kubenswrapper[4555]: I1209 15:00:51.994997 4555 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mc787" Dec 09 15:00:52 crc kubenswrapper[4555]: I1209 15:00:52.139531 4555 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-nnzzh" Dec 09 15:00:52 crc kubenswrapper[4555]: I1209 15:00:52.139572 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nnzzh" Dec 09 15:00:52 crc kubenswrapper[4555]: I1209 15:00:52.166771 4555 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nnzzh" Dec 09 15:00:53 crc kubenswrapper[4555]: I1209 15:00:53.910692 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7d49r" Dec 09 15:00:53 crc kubenswrapper[4555]: I1209 15:00:53.910731 4555 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7d49r" Dec 09 15:00:53 crc kubenswrapper[4555]: I1209 15:00:53.939458 4555 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-7d49r" Dec 09 15:00:54 crc kubenswrapper[4555]: I1209 15:00:54.400448 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/35f7e7bd-b667-42fa-b040-4a186e4570e3-metrics-certs\") pod \"network-metrics-daemon-6sc2q\" (UID: \"35f7e7bd-b667-42fa-b040-4a186e4570e3\") " pod="openshift-multus/network-metrics-daemon-6sc2q" Dec 09 15:00:54 crc kubenswrapper[4555]: I1209 15:00:54.405418 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/35f7e7bd-b667-42fa-b040-4a186e4570e3-metrics-certs\") pod \"network-metrics-daemon-6sc2q\" (UID: \"35f7e7bd-b667-42fa-b040-4a186e4570e3\") " pod="openshift-multus/network-metrics-daemon-6sc2q" Dec 09 15:00:54 crc kubenswrapper[4555]: I1209 15:00:54.570996 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sc2q" Dec 09 15:00:54 crc kubenswrapper[4555]: I1209 15:00:54.902707 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7d49r" Dec 09 15:00:54 crc kubenswrapper[4555]: I1209 15:00:54.942262 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-6sc2q"] Dec 09 15:00:55 crc kubenswrapper[4555]: I1209 15:00:55.531221 4555 patch_prober.go:28] interesting pod/machine-config-daemon-x6p6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 15:00:55 crc kubenswrapper[4555]: I1209 15:00:55.531262 4555 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 15:00:55 crc kubenswrapper[4555]: I1209 15:00:55.878838 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6sc2q" event={"ID":"35f7e7bd-b667-42fa-b040-4a186e4570e3","Type":"ContainerStarted","Data":"2347a3306b430b98b4bc5b2fe6d0af99c04ebe534414f5c3eeba8cc6a3bca97f"} Dec 09 15:00:55 crc kubenswrapper[4555]: I1209 15:00:55.878874 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6sc2q" event={"ID":"35f7e7bd-b667-42fa-b040-4a186e4570e3","Type":"ContainerStarted","Data":"b6edd29b263d620b1e13c6c6894751f0d39e5d8d81365cf3523ef176a98fb447"} Dec 09 15:00:55 crc kubenswrapper[4555]: I1209 15:00:55.878886 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6sc2q" event={"ID":"35f7e7bd-b667-42fa-b040-4a186e4570e3","Type":"ContainerStarted","Data":"ae0ce0d9ae02829ea364f06da59dcebd6ca1bf656f1fc2ac050a763b556ced99"} Dec 09 15:00:55 crc kubenswrapper[4555]: I1209 15:00:55.892249 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-6sc2q" podStartSLOduration=143.892235062 podStartE2EDuration="2m23.892235062s" podCreationTimestamp="2025-12-09 14:58:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 
15:00:55.889900549 +0000 UTC m=+166.026613267" watchObservedRunningTime="2025-12-09 15:00:55.892235062 +0000 UTC m=+166.028947779" Dec 09 15:01:01 crc kubenswrapper[4555]: I1209 15:01:01.907761 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vpsml" event={"ID":"99932cc2-84e4-4996-84ab-55407f09fa3a","Type":"ContainerStarted","Data":"01b17f78140a90febfd1c80a28c6bec5aa56082e46b0ca339f880856ff1276ff"} Dec 09 15:01:01 crc kubenswrapper[4555]: I1209 15:01:01.950830 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mc787" Dec 09 15:01:02 crc kubenswrapper[4555]: I1209 15:01:02.166738 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nnzzh" Dec 09 15:01:02 crc kubenswrapper[4555]: I1209 15:01:02.913421 4555 generic.go:334] "Generic (PLEG): container finished" podID="99932cc2-84e4-4996-84ab-55407f09fa3a" containerID="01b17f78140a90febfd1c80a28c6bec5aa56082e46b0ca339f880856ff1276ff" exitCode=0 Dec 09 15:01:02 crc kubenswrapper[4555]: I1209 15:01:02.913458 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vpsml" event={"ID":"99932cc2-84e4-4996-84ab-55407f09fa3a","Type":"ContainerDied","Data":"01b17f78140a90febfd1c80a28c6bec5aa56082e46b0ca339f880856ff1276ff"} Dec 09 15:01:03 crc kubenswrapper[4555]: I1209 15:01:03.918834 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vpsml" event={"ID":"99932cc2-84e4-4996-84ab-55407f09fa3a","Type":"ContainerStarted","Data":"8776f9f010d5bfb704640cf682bb9dc79a3277b20c5f23d9a9d7a103bf7e2cb0"} Dec 09 15:01:03 crc kubenswrapper[4555]: I1209 15:01:03.919875 4555 generic.go:334] "Generic (PLEG): container finished" podID="fd5cca88-9272-46b2-89f1-b0c2b44e2a9f" containerID="179c6fd2fbf50210d96266ee240dfacdb0a2b5a8a35f6d7f9336a6e50cbe18c8" exitCode=0 Dec 09 15:01:03 crc kubenswrapper[4555]: I1209 15:01:03.919904 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bdb9r" event={"ID":"fd5cca88-9272-46b2-89f1-b0c2b44e2a9f","Type":"ContainerDied","Data":"179c6fd2fbf50210d96266ee240dfacdb0a2b5a8a35f6d7f9336a6e50cbe18c8"} Dec 09 15:01:03 crc kubenswrapper[4555]: I1209 15:01:03.932503 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vpsml" podStartSLOduration=5.852610906 podStartE2EDuration="29.932489643s" podCreationTimestamp="2025-12-09 15:00:34 +0000 UTC" firstStartedPulling="2025-12-09 15:00:39.414847011 +0000 UTC m=+149.551559729" lastFinishedPulling="2025-12-09 15:01:03.494725748 +0000 UTC m=+173.631438466" observedRunningTime="2025-12-09 15:01:03.930855708 +0000 UTC m=+174.067568427" watchObservedRunningTime="2025-12-09 15:01:03.932489643 +0000 UTC m=+174.069202361" Dec 09 15:01:04 crc kubenswrapper[4555]: I1209 15:01:04.391963 4555 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nnzzh"] Dec 09 15:01:04 crc kubenswrapper[4555]: I1209 15:01:04.392320 4555 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-nnzzh" podUID="9ee87b32-b776-445a-b3f5-d94c87430742" containerName="registry-server" containerID="cri-o://173e2bac19f98e044889ad8bf0a8574a7f4395a10a08eca65ef4e48092867dd6" gracePeriod=2 Dec 09 15:01:04 crc kubenswrapper[4555]: I1209 15:01:04.768765 4555 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nnzzh" Dec 09 15:01:04 crc kubenswrapper[4555]: I1209 15:01:04.819556 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9pc6m\" (UniqueName: \"kubernetes.io/projected/9ee87b32-b776-445a-b3f5-d94c87430742-kube-api-access-9pc6m\") pod \"9ee87b32-b776-445a-b3f5-d94c87430742\" (UID: \"9ee87b32-b776-445a-b3f5-d94c87430742\") " Dec 09 15:01:04 crc kubenswrapper[4555]: I1209 15:01:04.819663 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ee87b32-b776-445a-b3f5-d94c87430742-catalog-content\") pod \"9ee87b32-b776-445a-b3f5-d94c87430742\" (UID: \"9ee87b32-b776-445a-b3f5-d94c87430742\") " Dec 09 15:01:04 crc kubenswrapper[4555]: I1209 15:01:04.819835 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ee87b32-b776-445a-b3f5-d94c87430742-utilities\") pod \"9ee87b32-b776-445a-b3f5-d94c87430742\" (UID: \"9ee87b32-b776-445a-b3f5-d94c87430742\") " Dec 09 15:01:04 crc kubenswrapper[4555]: I1209 15:01:04.820465 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ee87b32-b776-445a-b3f5-d94c87430742-utilities" (OuterVolumeSpecName: "utilities") pod "9ee87b32-b776-445a-b3f5-d94c87430742" (UID: "9ee87b32-b776-445a-b3f5-d94c87430742"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:01:04 crc kubenswrapper[4555]: I1209 15:01:04.820742 4555 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ee87b32-b776-445a-b3f5-d94c87430742-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 15:01:04 crc kubenswrapper[4555]: I1209 15:01:04.824134 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ee87b32-b776-445a-b3f5-d94c87430742-kube-api-access-9pc6m" (OuterVolumeSpecName: "kube-api-access-9pc6m") pod "9ee87b32-b776-445a-b3f5-d94c87430742" (UID: "9ee87b32-b776-445a-b3f5-d94c87430742"). InnerVolumeSpecName "kube-api-access-9pc6m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:01:04 crc kubenswrapper[4555]: I1209 15:01:04.856927 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ee87b32-b776-445a-b3f5-d94c87430742-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9ee87b32-b776-445a-b3f5-d94c87430742" (UID: "9ee87b32-b776-445a-b3f5-d94c87430742"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:01:04 crc kubenswrapper[4555]: I1209 15:01:04.921930 4555 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ee87b32-b776-445a-b3f5-d94c87430742-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 15:01:04 crc kubenswrapper[4555]: I1209 15:01:04.921961 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9pc6m\" (UniqueName: \"kubernetes.io/projected/9ee87b32-b776-445a-b3f5-d94c87430742-kube-api-access-9pc6m\") on node \"crc\" DevicePath \"\"" Dec 09 15:01:04 crc kubenswrapper[4555]: I1209 15:01:04.925427 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bdb9r" event={"ID":"fd5cca88-9272-46b2-89f1-b0c2b44e2a9f","Type":"ContainerStarted","Data":"4ec27fbe6f3735de959aa11530f42a6fce5037a4db3e8eaa07ccb4da6b1dd517"} Dec 09 15:01:04 crc kubenswrapper[4555]: I1209 15:01:04.927026 4555 generic.go:334] "Generic (PLEG): container finished" podID="5b3ae0d5-62f7-4219-b006-f6a33d913018" containerID="eea80f8422e2e199412dce75228dfb0549eb133fedcf9987ada61eac543bf64e" exitCode=0 Dec 09 15:01:04 crc kubenswrapper[4555]: I1209 15:01:04.927088 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dwd6f" event={"ID":"5b3ae0d5-62f7-4219-b006-f6a33d913018","Type":"ContainerDied","Data":"eea80f8422e2e199412dce75228dfb0549eb133fedcf9987ada61eac543bf64e"} Dec 09 15:01:04 crc kubenswrapper[4555]: I1209 15:01:04.929852 4555 generic.go:334] "Generic (PLEG): container finished" podID="9ee87b32-b776-445a-b3f5-d94c87430742" containerID="173e2bac19f98e044889ad8bf0a8574a7f4395a10a08eca65ef4e48092867dd6" exitCode=0 Dec 09 15:01:04 crc kubenswrapper[4555]: I1209 15:01:04.929908 4555 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nnzzh" Dec 09 15:01:04 crc kubenswrapper[4555]: I1209 15:01:04.929924 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nnzzh" event={"ID":"9ee87b32-b776-445a-b3f5-d94c87430742","Type":"ContainerDied","Data":"173e2bac19f98e044889ad8bf0a8574a7f4395a10a08eca65ef4e48092867dd6"} Dec 09 15:01:04 crc kubenswrapper[4555]: I1209 15:01:04.929959 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nnzzh" event={"ID":"9ee87b32-b776-445a-b3f5-d94c87430742","Type":"ContainerDied","Data":"4255d4d1a34060f2ad4652af104e6a193491cabc20a1518c7a3fae1ce746d0a1"} Dec 09 15:01:04 crc kubenswrapper[4555]: I1209 15:01:04.929983 4555 scope.go:117] "RemoveContainer" containerID="173e2bac19f98e044889ad8bf0a8574a7f4395a10a08eca65ef4e48092867dd6" Dec 09 15:01:04 crc kubenswrapper[4555]: I1209 15:01:04.946204 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bdb9r" podStartSLOduration=2.067543335 podStartE2EDuration="33.946193032s" podCreationTimestamp="2025-12-09 15:00:31 +0000 UTC" firstStartedPulling="2025-12-09 15:00:32.643119096 +0000 UTC m=+142.779831814" lastFinishedPulling="2025-12-09 15:01:04.521768792 +0000 UTC m=+174.658481511" observedRunningTime="2025-12-09 15:01:04.942268118 +0000 UTC m=+175.078980836" watchObservedRunningTime="2025-12-09 15:01:04.946193032 +0000 UTC m=+175.082905751" Dec 09 15:01:04 crc kubenswrapper[4555]: I1209 15:01:04.946902 4555 scope.go:117] "RemoveContainer" containerID="bf6111fac6cca6f155f479b2c907b1c7b1defd4847fca4fb74b7b04ff01aa821" Dec 09 15:01:04 crc kubenswrapper[4555]: I1209 15:01:04.969731 4555 scope.go:117] "RemoveContainer" containerID="90665757b4fb62635f77fbf1b82155622b338fb1dd704f86e9f132cc8b148804" Dec 09 15:01:04 crc kubenswrapper[4555]: I1209 15:01:04.978358 4555 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nnzzh"] Dec 09 15:01:04 crc kubenswrapper[4555]: I1209 15:01:04.980829 4555 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-nnzzh"] Dec 09 15:01:04 crc kubenswrapper[4555]: I1209 15:01:04.985332 4555 scope.go:117] "RemoveContainer" containerID="173e2bac19f98e044889ad8bf0a8574a7f4395a10a08eca65ef4e48092867dd6" Dec 09 15:01:04 crc kubenswrapper[4555]: E1209 15:01:04.985750 4555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"173e2bac19f98e044889ad8bf0a8574a7f4395a10a08eca65ef4e48092867dd6\": container with ID starting with 173e2bac19f98e044889ad8bf0a8574a7f4395a10a08eca65ef4e48092867dd6 not found: ID does not exist" containerID="173e2bac19f98e044889ad8bf0a8574a7f4395a10a08eca65ef4e48092867dd6" Dec 09 15:01:04 crc kubenswrapper[4555]: I1209 15:01:04.985783 4555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"173e2bac19f98e044889ad8bf0a8574a7f4395a10a08eca65ef4e48092867dd6"} err="failed to get container status \"173e2bac19f98e044889ad8bf0a8574a7f4395a10a08eca65ef4e48092867dd6\": rpc error: code = NotFound desc = could not find container \"173e2bac19f98e044889ad8bf0a8574a7f4395a10a08eca65ef4e48092867dd6\": container with ID starting with 173e2bac19f98e044889ad8bf0a8574a7f4395a10a08eca65ef4e48092867dd6 not found: ID does not exist" Dec 09 15:01:04 crc kubenswrapper[4555]: I1209 15:01:04.985820 4555 scope.go:117] 
"RemoveContainer" containerID="bf6111fac6cca6f155f479b2c907b1c7b1defd4847fca4fb74b7b04ff01aa821" Dec 09 15:01:04 crc kubenswrapper[4555]: E1209 15:01:04.986071 4555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf6111fac6cca6f155f479b2c907b1c7b1defd4847fca4fb74b7b04ff01aa821\": container with ID starting with bf6111fac6cca6f155f479b2c907b1c7b1defd4847fca4fb74b7b04ff01aa821 not found: ID does not exist" containerID="bf6111fac6cca6f155f479b2c907b1c7b1defd4847fca4fb74b7b04ff01aa821" Dec 09 15:01:04 crc kubenswrapper[4555]: I1209 15:01:04.986127 4555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf6111fac6cca6f155f479b2c907b1c7b1defd4847fca4fb74b7b04ff01aa821"} err="failed to get container status \"bf6111fac6cca6f155f479b2c907b1c7b1defd4847fca4fb74b7b04ff01aa821\": rpc error: code = NotFound desc = could not find container \"bf6111fac6cca6f155f479b2c907b1c7b1defd4847fca4fb74b7b04ff01aa821\": container with ID starting with bf6111fac6cca6f155f479b2c907b1c7b1defd4847fca4fb74b7b04ff01aa821 not found: ID does not exist" Dec 09 15:01:04 crc kubenswrapper[4555]: I1209 15:01:04.986146 4555 scope.go:117] "RemoveContainer" containerID="90665757b4fb62635f77fbf1b82155622b338fb1dd704f86e9f132cc8b148804" Dec 09 15:01:04 crc kubenswrapper[4555]: E1209 15:01:04.986406 4555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90665757b4fb62635f77fbf1b82155622b338fb1dd704f86e9f132cc8b148804\": container with ID starting with 90665757b4fb62635f77fbf1b82155622b338fb1dd704f86e9f132cc8b148804 not found: ID does not exist" containerID="90665757b4fb62635f77fbf1b82155622b338fb1dd704f86e9f132cc8b148804" Dec 09 15:01:04 crc kubenswrapper[4555]: I1209 15:01:04.986430 4555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90665757b4fb62635f77fbf1b82155622b338fb1dd704f86e9f132cc8b148804"} err="failed to get container status \"90665757b4fb62635f77fbf1b82155622b338fb1dd704f86e9f132cc8b148804\": rpc error: code = NotFound desc = could not find container \"90665757b4fb62635f77fbf1b82155622b338fb1dd704f86e9f132cc8b148804\": container with ID starting with 90665757b4fb62635f77fbf1b82155622b338fb1dd704f86e9f132cc8b148804 not found: ID does not exist" Dec 09 15:01:05 crc kubenswrapper[4555]: I1209 15:01:05.309659 4555 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vpsml" Dec 09 15:01:05 crc kubenswrapper[4555]: I1209 15:01:05.309689 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vpsml" Dec 09 15:01:05 crc kubenswrapper[4555]: I1209 15:01:05.934818 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dwd6f" event={"ID":"5b3ae0d5-62f7-4219-b006-f6a33d913018","Type":"ContainerStarted","Data":"173087e7be1476e4a92e54c73ed963e224973a215618d17758b9115edad9ad6b"} Dec 09 15:01:05 crc kubenswrapper[4555]: I1209 15:01:05.935888 4555 generic.go:334] "Generic (PLEG): container finished" podID="b86a6402-17c6-4be0-b3ab-7d0bd357f8b2" containerID="9448140ee3785bafd14d6a076ad1f08a2cf23545d8ad3990c282680987bcdb0c" exitCode=0 Dec 09 15:01:05 crc kubenswrapper[4555]: I1209 15:01:05.935934 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kwts6" 
event={"ID":"b86a6402-17c6-4be0-b3ab-7d0bd357f8b2","Type":"ContainerDied","Data":"9448140ee3785bafd14d6a076ad1f08a2cf23545d8ad3990c282680987bcdb0c"} Dec 09 15:01:05 crc kubenswrapper[4555]: I1209 15:01:05.948594 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dwd6f" podStartSLOduration=2.032693837 podStartE2EDuration="31.948582231s" podCreationTimestamp="2025-12-09 15:00:34 +0000 UTC" firstStartedPulling="2025-12-09 15:00:35.698237355 +0000 UTC m=+145.834950073" lastFinishedPulling="2025-12-09 15:01:05.61412575 +0000 UTC m=+175.750838467" observedRunningTime="2025-12-09 15:01:05.94699274 +0000 UTC m=+176.083705458" watchObservedRunningTime="2025-12-09 15:01:05.948582231 +0000 UTC m=+176.085294949" Dec 09 15:01:06 crc kubenswrapper[4555]: I1209 15:01:06.166839 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ee87b32-b776-445a-b3f5-d94c87430742" path="/var/lib/kubelet/pods/9ee87b32-b776-445a-b3f5-d94c87430742/volumes" Dec 09 15:01:06 crc kubenswrapper[4555]: I1209 15:01:06.218738 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-wt67p" Dec 09 15:01:06 crc kubenswrapper[4555]: I1209 15:01:06.337994 4555 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vpsml" podUID="99932cc2-84e4-4996-84ab-55407f09fa3a" containerName="registry-server" probeResult="failure" output=< Dec 09 15:01:06 crc kubenswrapper[4555]: timeout: failed to connect service ":50051" within 1s Dec 09 15:01:06 crc kubenswrapper[4555]: > Dec 09 15:01:06 crc kubenswrapper[4555]: I1209 15:01:06.943127 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kwts6" event={"ID":"b86a6402-17c6-4be0-b3ab-7d0bd357f8b2","Type":"ContainerStarted","Data":"5c28cd772389ea197cb673becd99f287a66f5a2e43fecb57d58c54d75a92287f"} Dec 09 15:01:06 crc kubenswrapper[4555]: I1209 15:01:06.957345 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kwts6" podStartSLOduration=2.048722437 podStartE2EDuration="35.95733327s" podCreationTimestamp="2025-12-09 15:00:31 +0000 UTC" firstStartedPulling="2025-12-09 15:00:32.649507515 +0000 UTC m=+142.786220234" lastFinishedPulling="2025-12-09 15:01:06.558118349 +0000 UTC m=+176.694831067" observedRunningTime="2025-12-09 15:01:06.956467951 +0000 UTC m=+177.093180669" watchObservedRunningTime="2025-12-09 15:01:06.95733327 +0000 UTC m=+177.094045977" Dec 09 15:01:09 crc kubenswrapper[4555]: I1209 15:01:09.604089 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 09 15:01:09 crc kubenswrapper[4555]: E1209 15:01:09.604442 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ee87b32-b776-445a-b3f5-d94c87430742" containerName="extract-utilities" Dec 09 15:01:09 crc kubenswrapper[4555]: I1209 15:01:09.604453 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ee87b32-b776-445a-b3f5-d94c87430742" containerName="extract-utilities" Dec 09 15:01:09 crc kubenswrapper[4555]: E1209 15:01:09.604464 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ee87b32-b776-445a-b3f5-d94c87430742" containerName="extract-content" Dec 09 15:01:09 crc kubenswrapper[4555]: I1209 15:01:09.604471 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ee87b32-b776-445a-b3f5-d94c87430742" 
containerName="extract-content" Dec 09 15:01:09 crc kubenswrapper[4555]: E1209 15:01:09.604482 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ee87b32-b776-445a-b3f5-d94c87430742" containerName="registry-server" Dec 09 15:01:09 crc kubenswrapper[4555]: I1209 15:01:09.604488 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ee87b32-b776-445a-b3f5-d94c87430742" containerName="registry-server" Dec 09 15:01:09 crc kubenswrapper[4555]: E1209 15:01:09.604495 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04dd4e81-06ee-4e2c-aab2-8906307ef074" containerName="pruner" Dec 09 15:01:09 crc kubenswrapper[4555]: I1209 15:01:09.604500 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="04dd4e81-06ee-4e2c-aab2-8906307ef074" containerName="pruner" Dec 09 15:01:09 crc kubenswrapper[4555]: E1209 15:01:09.604513 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e0577ca-03c5-4ede-8c4a-190bac74ed06" containerName="pruner" Dec 09 15:01:09 crc kubenswrapper[4555]: I1209 15:01:09.604518 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e0577ca-03c5-4ede-8c4a-190bac74ed06" containerName="pruner" Dec 09 15:01:09 crc kubenswrapper[4555]: I1209 15:01:09.604605 4555 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e0577ca-03c5-4ede-8c4a-190bac74ed06" containerName="pruner" Dec 09 15:01:09 crc kubenswrapper[4555]: I1209 15:01:09.604615 4555 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ee87b32-b776-445a-b3f5-d94c87430742" containerName="registry-server" Dec 09 15:01:09 crc kubenswrapper[4555]: I1209 15:01:09.604622 4555 memory_manager.go:354] "RemoveStaleState removing state" podUID="04dd4e81-06ee-4e2c-aab2-8906307ef074" containerName="pruner" Dec 09 15:01:09 crc kubenswrapper[4555]: I1209 15:01:09.604907 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 09 15:01:09 crc kubenswrapper[4555]: I1209 15:01:09.606571 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 09 15:01:09 crc kubenswrapper[4555]: I1209 15:01:09.609733 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 09 15:01:09 crc kubenswrapper[4555]: I1209 15:01:09.610736 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 09 15:01:09 crc kubenswrapper[4555]: I1209 15:01:09.669071 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/33e10b2f-0d15-4c42-8832-d39e22d08950-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"33e10b2f-0d15-4c42-8832-d39e22d08950\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 09 15:01:09 crc kubenswrapper[4555]: I1209 15:01:09.669116 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/33e10b2f-0d15-4c42-8832-d39e22d08950-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"33e10b2f-0d15-4c42-8832-d39e22d08950\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 09 15:01:09 crc kubenswrapper[4555]: I1209 15:01:09.770132 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/33e10b2f-0d15-4c42-8832-d39e22d08950-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"33e10b2f-0d15-4c42-8832-d39e22d08950\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 09 15:01:09 crc kubenswrapper[4555]: I1209 15:01:09.770265 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/33e10b2f-0d15-4c42-8832-d39e22d08950-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"33e10b2f-0d15-4c42-8832-d39e22d08950\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 09 15:01:09 crc kubenswrapper[4555]: I1209 15:01:09.770352 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/33e10b2f-0d15-4c42-8832-d39e22d08950-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"33e10b2f-0d15-4c42-8832-d39e22d08950\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 09 15:01:09 crc kubenswrapper[4555]: I1209 15:01:09.787262 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/33e10b2f-0d15-4c42-8832-d39e22d08950-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"33e10b2f-0d15-4c42-8832-d39e22d08950\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 09 15:01:09 crc kubenswrapper[4555]: I1209 15:01:09.920270 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 09 15:01:10 crc kubenswrapper[4555]: I1209 15:01:10.260152 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 09 15:01:10 crc kubenswrapper[4555]: W1209 15:01:10.265474 4555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod33e10b2f_0d15_4c42_8832_d39e22d08950.slice/crio-3d6d67ed409a6c21eb55ab887ebf58f145c7be7c074ada0d4dabbe3783b875fe WatchSource:0}: Error finding container 3d6d67ed409a6c21eb55ab887ebf58f145c7be7c074ada0d4dabbe3783b875fe: Status 404 returned error can't find the container with id 3d6d67ed409a6c21eb55ab887ebf58f145c7be7c074ada0d4dabbe3783b875fe Dec 09 15:01:10 crc kubenswrapper[4555]: I1209 15:01:10.967220 4555 generic.go:334] "Generic (PLEG): container finished" podID="33e10b2f-0d15-4c42-8832-d39e22d08950" containerID="96ab0767828d3e4d8664ad6c512947ee761c3cd4baf2f6255f1d5f1e9e160db6" exitCode=0 Dec 09 15:01:10 crc kubenswrapper[4555]: I1209 15:01:10.967325 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"33e10b2f-0d15-4c42-8832-d39e22d08950","Type":"ContainerDied","Data":"96ab0767828d3e4d8664ad6c512947ee761c3cd4baf2f6255f1d5f1e9e160db6"} Dec 09 15:01:10 crc kubenswrapper[4555]: I1209 15:01:10.967439 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"33e10b2f-0d15-4c42-8832-d39e22d08950","Type":"ContainerStarted","Data":"3d6d67ed409a6c21eb55ab887ebf58f145c7be7c074ada0d4dabbe3783b875fe"} Dec 09 15:01:11 crc kubenswrapper[4555]: I1209 15:01:11.694563 4555 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bdb9r" Dec 09 15:01:11 crc kubenswrapper[4555]: I1209 15:01:11.694763 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bdb9r" Dec 09 15:01:11 crc kubenswrapper[4555]: I1209 15:01:11.721615 4555 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bdb9r" Dec 09 15:01:11 crc kubenswrapper[4555]: I1209 15:01:11.974058 4555 generic.go:334] "Generic (PLEG): container finished" podID="d3f6bdc8-6545-4114-982c-f452951e5a0b" containerID="1e004b337b056bea23a352f5febbc1c88bb46d90f27d7c02409bb8c419359501" exitCode=0 Dec 09 15:01:11 crc kubenswrapper[4555]: I1209 15:01:11.974121 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kqnwk" event={"ID":"d3f6bdc8-6545-4114-982c-f452951e5a0b","Type":"ContainerDied","Data":"1e004b337b056bea23a352f5febbc1c88bb46d90f27d7c02409bb8c419359501"} Dec 09 15:01:12 crc kubenswrapper[4555]: I1209 15:01:12.008685 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bdb9r" Dec 09 15:01:12 crc kubenswrapper[4555]: I1209 15:01:12.223118 4555 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 09 15:01:12 crc kubenswrapper[4555]: I1209 15:01:12.293579 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/33e10b2f-0d15-4c42-8832-d39e22d08950-kubelet-dir\") pod \"33e10b2f-0d15-4c42-8832-d39e22d08950\" (UID: \"33e10b2f-0d15-4c42-8832-d39e22d08950\") " Dec 09 15:01:12 crc kubenswrapper[4555]: I1209 15:01:12.293663 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/33e10b2f-0d15-4c42-8832-d39e22d08950-kube-api-access\") pod \"33e10b2f-0d15-4c42-8832-d39e22d08950\" (UID: \"33e10b2f-0d15-4c42-8832-d39e22d08950\") " Dec 09 15:01:12 crc kubenswrapper[4555]: I1209 15:01:12.293683 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/33e10b2f-0d15-4c42-8832-d39e22d08950-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "33e10b2f-0d15-4c42-8832-d39e22d08950" (UID: "33e10b2f-0d15-4c42-8832-d39e22d08950"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 15:01:12 crc kubenswrapper[4555]: I1209 15:01:12.293948 4555 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/33e10b2f-0d15-4c42-8832-d39e22d08950-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 09 15:01:12 crc kubenswrapper[4555]: I1209 15:01:12.297618 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33e10b2f-0d15-4c42-8832-d39e22d08950-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "33e10b2f-0d15-4c42-8832-d39e22d08950" (UID: "33e10b2f-0d15-4c42-8832-d39e22d08950"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:01:12 crc kubenswrapper[4555]: I1209 15:01:12.317779 4555 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kwts6" Dec 09 15:01:12 crc kubenswrapper[4555]: I1209 15:01:12.317816 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kwts6" Dec 09 15:01:12 crc kubenswrapper[4555]: I1209 15:01:12.346565 4555 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kwts6" Dec 09 15:01:12 crc kubenswrapper[4555]: I1209 15:01:12.395340 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/33e10b2f-0d15-4c42-8832-d39e22d08950-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 09 15:01:12 crc kubenswrapper[4555]: I1209 15:01:12.982289 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kqnwk" event={"ID":"d3f6bdc8-6545-4114-982c-f452951e5a0b","Type":"ContainerStarted","Data":"e4b0e3c7a0c2ef463ac4142be48c2f5d11ab55969c3e1715b263081cd6de7b58"} Dec 09 15:01:12 crc kubenswrapper[4555]: I1209 15:01:12.984367 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"33e10b2f-0d15-4c42-8832-d39e22d08950","Type":"ContainerDied","Data":"3d6d67ed409a6c21eb55ab887ebf58f145c7be7c074ada0d4dabbe3783b875fe"} Dec 09 15:01:12 crc kubenswrapper[4555]: I1209 15:01:12.984401 4555 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d6d67ed409a6c21eb55ab887ebf58f145c7be7c074ada0d4dabbe3783b875fe" Dec 09 15:01:12 crc kubenswrapper[4555]: I1209 15:01:12.984455 4555 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 09 15:01:13 crc kubenswrapper[4555]: I1209 15:01:13.012160 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kwts6" Dec 09 15:01:13 crc kubenswrapper[4555]: I1209 15:01:13.023072 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kqnwk" podStartSLOduration=3.126177924 podStartE2EDuration="40.023059562s" podCreationTimestamp="2025-12-09 15:00:33 +0000 UTC" firstStartedPulling="2025-12-09 15:00:35.718329641 +0000 UTC m=+145.855042359" lastFinishedPulling="2025-12-09 15:01:12.615211279 +0000 UTC m=+182.751923997" observedRunningTime="2025-12-09 15:01:12.996468986 +0000 UTC m=+183.133181704" watchObservedRunningTime="2025-12-09 15:01:13.023059562 +0000 UTC m=+183.159772279" Dec 09 15:01:13 crc kubenswrapper[4555]: I1209 15:01:13.989895 4555 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-v5j6b"] Dec 09 15:01:14 crc kubenswrapper[4555]: I1209 15:01:14.307219 4555 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kqnwk" Dec 09 15:01:14 crc kubenswrapper[4555]: I1209 15:01:14.307266 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kqnwk" Dec 09 15:01:14 crc kubenswrapper[4555]: I1209 15:01:14.334223 4555 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kqnwk" Dec 09 15:01:14 crc kubenswrapper[4555]: I1209 15:01:14.948024 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dwd6f" Dec 09 15:01:14 crc kubenswrapper[4555]: I1209 15:01:14.948075 4555 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dwd6f" Dec 09 15:01:14 crc kubenswrapper[4555]: I1209 15:01:14.974392 4555 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dwd6f" Dec 09 15:01:15 crc kubenswrapper[4555]: I1209 15:01:15.016961 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dwd6f" Dec 09 15:01:15 crc kubenswrapper[4555]: I1209 15:01:15.285875 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 15:01:15 crc kubenswrapper[4555]: I1209 15:01:15.343639 4555 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vpsml" Dec 09 15:01:15 crc kubenswrapper[4555]: I1209 15:01:15.375993 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vpsml" Dec 09 15:01:15 crc kubenswrapper[4555]: I1209 15:01:15.792246 4555 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kwts6"] Dec 09 15:01:15 crc kubenswrapper[4555]: I1209 15:01:15.792425 4555 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kwts6" podUID="b86a6402-17c6-4be0-b3ab-7d0bd357f8b2" containerName="registry-server" containerID="cri-o://5c28cd772389ea197cb673becd99f287a66f5a2e43fecb57d58c54d75a92287f" gracePeriod=2 Dec 09 15:01:16 crc 
kubenswrapper[4555]: I1209 15:01:15.997601 4555 generic.go:334] "Generic (PLEG): container finished" podID="b86a6402-17c6-4be0-b3ab-7d0bd357f8b2" containerID="5c28cd772389ea197cb673becd99f287a66f5a2e43fecb57d58c54d75a92287f" exitCode=0 Dec 09 15:01:16 crc kubenswrapper[4555]: I1209 15:01:15.998356 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kwts6" event={"ID":"b86a6402-17c6-4be0-b3ab-7d0bd357f8b2","Type":"ContainerDied","Data":"5c28cd772389ea197cb673becd99f287a66f5a2e43fecb57d58c54d75a92287f"} Dec 09 15:01:16 crc kubenswrapper[4555]: I1209 15:01:16.156654 4555 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kwts6" Dec 09 15:01:16 crc kubenswrapper[4555]: I1209 15:01:16.202271 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 09 15:01:16 crc kubenswrapper[4555]: E1209 15:01:16.202467 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b86a6402-17c6-4be0-b3ab-7d0bd357f8b2" containerName="extract-content" Dec 09 15:01:16 crc kubenswrapper[4555]: I1209 15:01:16.202480 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="b86a6402-17c6-4be0-b3ab-7d0bd357f8b2" containerName="extract-content" Dec 09 15:01:16 crc kubenswrapper[4555]: E1209 15:01:16.202492 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33e10b2f-0d15-4c42-8832-d39e22d08950" containerName="pruner" Dec 09 15:01:16 crc kubenswrapper[4555]: I1209 15:01:16.202498 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="33e10b2f-0d15-4c42-8832-d39e22d08950" containerName="pruner" Dec 09 15:01:16 crc kubenswrapper[4555]: E1209 15:01:16.202505 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b86a6402-17c6-4be0-b3ab-7d0bd357f8b2" containerName="extract-utilities" Dec 09 15:01:16 crc kubenswrapper[4555]: I1209 15:01:16.202511 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="b86a6402-17c6-4be0-b3ab-7d0bd357f8b2" containerName="extract-utilities" Dec 09 15:01:16 crc kubenswrapper[4555]: E1209 15:01:16.202522 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b86a6402-17c6-4be0-b3ab-7d0bd357f8b2" containerName="registry-server" Dec 09 15:01:16 crc kubenswrapper[4555]: I1209 15:01:16.202527 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="b86a6402-17c6-4be0-b3ab-7d0bd357f8b2" containerName="registry-server" Dec 09 15:01:16 crc kubenswrapper[4555]: I1209 15:01:16.202618 4555 memory_manager.go:354] "RemoveStaleState removing state" podUID="33e10b2f-0d15-4c42-8832-d39e22d08950" containerName="pruner" Dec 09 15:01:16 crc kubenswrapper[4555]: I1209 15:01:16.202626 4555 memory_manager.go:354] "RemoveStaleState removing state" podUID="b86a6402-17c6-4be0-b3ab-7d0bd357f8b2" containerName="registry-server" Dec 09 15:01:16 crc kubenswrapper[4555]: I1209 15:01:16.202932 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 09 15:01:16 crc kubenswrapper[4555]: I1209 15:01:16.204183 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 09 15:01:16 crc kubenswrapper[4555]: I1209 15:01:16.204601 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 09 15:01:16 crc kubenswrapper[4555]: I1209 15:01:16.209449 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 09 15:01:16 crc kubenswrapper[4555]: I1209 15:01:16.238862 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8c6gp\" (UniqueName: \"kubernetes.io/projected/b86a6402-17c6-4be0-b3ab-7d0bd357f8b2-kube-api-access-8c6gp\") pod \"b86a6402-17c6-4be0-b3ab-7d0bd357f8b2\" (UID: \"b86a6402-17c6-4be0-b3ab-7d0bd357f8b2\") " Dec 09 15:01:16 crc kubenswrapper[4555]: I1209 15:01:16.238962 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b86a6402-17c6-4be0-b3ab-7d0bd357f8b2-utilities\") pod \"b86a6402-17c6-4be0-b3ab-7d0bd357f8b2\" (UID: \"b86a6402-17c6-4be0-b3ab-7d0bd357f8b2\") " Dec 09 15:01:16 crc kubenswrapper[4555]: I1209 15:01:16.238989 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b86a6402-17c6-4be0-b3ab-7d0bd357f8b2-catalog-content\") pod \"b86a6402-17c6-4be0-b3ab-7d0bd357f8b2\" (UID: \"b86a6402-17c6-4be0-b3ab-7d0bd357f8b2\") " Dec 09 15:01:16 crc kubenswrapper[4555]: I1209 15:01:16.239152 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bbdfa4d2-08c3-40dd-b100-622dceb965cd-kube-api-access\") pod \"installer-9-crc\" (UID: \"bbdfa4d2-08c3-40dd-b100-622dceb965cd\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 09 15:01:16 crc kubenswrapper[4555]: I1209 15:01:16.239191 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/bbdfa4d2-08c3-40dd-b100-622dceb965cd-var-lock\") pod \"installer-9-crc\" (UID: \"bbdfa4d2-08c3-40dd-b100-622dceb965cd\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 09 15:01:16 crc kubenswrapper[4555]: I1209 15:01:16.239230 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bbdfa4d2-08c3-40dd-b100-622dceb965cd-kubelet-dir\") pod \"installer-9-crc\" (UID: \"bbdfa4d2-08c3-40dd-b100-622dceb965cd\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 09 15:01:16 crc kubenswrapper[4555]: I1209 15:01:16.239575 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b86a6402-17c6-4be0-b3ab-7d0bd357f8b2-utilities" (OuterVolumeSpecName: "utilities") pod "b86a6402-17c6-4be0-b3ab-7d0bd357f8b2" (UID: "b86a6402-17c6-4be0-b3ab-7d0bd357f8b2"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:01:16 crc kubenswrapper[4555]: I1209 15:01:16.244392 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b86a6402-17c6-4be0-b3ab-7d0bd357f8b2-kube-api-access-8c6gp" (OuterVolumeSpecName: "kube-api-access-8c6gp") pod "b86a6402-17c6-4be0-b3ab-7d0bd357f8b2" (UID: "b86a6402-17c6-4be0-b3ab-7d0bd357f8b2"). InnerVolumeSpecName "kube-api-access-8c6gp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:01:16 crc kubenswrapper[4555]: I1209 15:01:16.280138 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b86a6402-17c6-4be0-b3ab-7d0bd357f8b2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b86a6402-17c6-4be0-b3ab-7d0bd357f8b2" (UID: "b86a6402-17c6-4be0-b3ab-7d0bd357f8b2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:01:16 crc kubenswrapper[4555]: I1209 15:01:16.340084 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bbdfa4d2-08c3-40dd-b100-622dceb965cd-kube-api-access\") pod \"installer-9-crc\" (UID: \"bbdfa4d2-08c3-40dd-b100-622dceb965cd\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 09 15:01:16 crc kubenswrapper[4555]: I1209 15:01:16.340157 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/bbdfa4d2-08c3-40dd-b100-622dceb965cd-var-lock\") pod \"installer-9-crc\" (UID: \"bbdfa4d2-08c3-40dd-b100-622dceb965cd\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 09 15:01:16 crc kubenswrapper[4555]: I1209 15:01:16.340207 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bbdfa4d2-08c3-40dd-b100-622dceb965cd-kubelet-dir\") pod \"installer-9-crc\" (UID: \"bbdfa4d2-08c3-40dd-b100-622dceb965cd\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 09 15:01:16 crc kubenswrapper[4555]: I1209 15:01:16.340282 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8c6gp\" (UniqueName: \"kubernetes.io/projected/b86a6402-17c6-4be0-b3ab-7d0bd357f8b2-kube-api-access-8c6gp\") on node \"crc\" DevicePath \"\"" Dec 09 15:01:16 crc kubenswrapper[4555]: I1209 15:01:16.340293 4555 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b86a6402-17c6-4be0-b3ab-7d0bd357f8b2-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 15:01:16 crc kubenswrapper[4555]: I1209 15:01:16.340316 4555 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b86a6402-17c6-4be0-b3ab-7d0bd357f8b2-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 15:01:16 crc kubenswrapper[4555]: I1209 15:01:16.340352 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bbdfa4d2-08c3-40dd-b100-622dceb965cd-kubelet-dir\") pod \"installer-9-crc\" (UID: \"bbdfa4d2-08c3-40dd-b100-622dceb965cd\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 09 15:01:16 crc kubenswrapper[4555]: I1209 15:01:16.340549 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/bbdfa4d2-08c3-40dd-b100-622dceb965cd-var-lock\") pod \"installer-9-crc\" (UID: 
\"bbdfa4d2-08c3-40dd-b100-622dceb965cd\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 09 15:01:16 crc kubenswrapper[4555]: I1209 15:01:16.352393 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bbdfa4d2-08c3-40dd-b100-622dceb965cd-kube-api-access\") pod \"installer-9-crc\" (UID: \"bbdfa4d2-08c3-40dd-b100-622dceb965cd\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 09 15:01:16 crc kubenswrapper[4555]: I1209 15:01:16.515106 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 09 15:01:16 crc kubenswrapper[4555]: I1209 15:01:16.851350 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 09 15:01:17 crc kubenswrapper[4555]: I1209 15:01:17.008951 4555 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kwts6" Dec 09 15:01:17 crc kubenswrapper[4555]: I1209 15:01:17.008972 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kwts6" event={"ID":"b86a6402-17c6-4be0-b3ab-7d0bd357f8b2","Type":"ContainerDied","Data":"73186d0f13e83df9c0000055980b9bb60c0df511ec683aa8744cca68418526ae"} Dec 09 15:01:17 crc kubenswrapper[4555]: I1209 15:01:17.009017 4555 scope.go:117] "RemoveContainer" containerID="5c28cd772389ea197cb673becd99f287a66f5a2e43fecb57d58c54d75a92287f" Dec 09 15:01:17 crc kubenswrapper[4555]: I1209 15:01:17.010070 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"bbdfa4d2-08c3-40dd-b100-622dceb965cd","Type":"ContainerStarted","Data":"c534cf5d61a58d62da6c8c9a2f1096fdf12bac7dbf22c08fa3f2bcc3966a5863"} Dec 09 15:01:17 crc kubenswrapper[4555]: I1209 15:01:17.019074 4555 scope.go:117] "RemoveContainer" containerID="9448140ee3785bafd14d6a076ad1f08a2cf23545d8ad3990c282680987bcdb0c" Dec 09 15:01:17 crc kubenswrapper[4555]: I1209 15:01:17.030397 4555 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kwts6"] Dec 09 15:01:17 crc kubenswrapper[4555]: I1209 15:01:17.035404 4555 scope.go:117] "RemoveContainer" containerID="29cf6f29221600b318f9a76c7097af53c14d428293bfd161b51a1a5a142423d8" Dec 09 15:01:17 crc kubenswrapper[4555]: I1209 15:01:17.041344 4555 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kwts6"] Dec 09 15:01:18 crc kubenswrapper[4555]: I1209 15:01:18.015998 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"bbdfa4d2-08c3-40dd-b100-622dceb965cd","Type":"ContainerStarted","Data":"17e652929c1a43cb321af7b81169d93e40be82e9e58a0be3f8f9139afd089153"} Dec 09 15:01:18 crc kubenswrapper[4555]: I1209 15:01:18.029350 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.029337048 podStartE2EDuration="2.029337048s" podCreationTimestamp="2025-12-09 15:01:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:01:18.025664722 +0000 UTC m=+188.162377440" watchObservedRunningTime="2025-12-09 15:01:18.029337048 +0000 UTC m=+188.166049766" Dec 09 15:01:18 crc kubenswrapper[4555]: I1209 15:01:18.165999 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="b86a6402-17c6-4be0-b3ab-7d0bd357f8b2" path="/var/lib/kubelet/pods/b86a6402-17c6-4be0-b3ab-7d0bd357f8b2/volumes" Dec 09 15:01:18 crc kubenswrapper[4555]: I1209 15:01:18.791453 4555 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vpsml"] Dec 09 15:01:18 crc kubenswrapper[4555]: I1209 15:01:18.791647 4555 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vpsml" podUID="99932cc2-84e4-4996-84ab-55407f09fa3a" containerName="registry-server" containerID="cri-o://8776f9f010d5bfb704640cf682bb9dc79a3277b20c5f23d9a9d7a103bf7e2cb0" gracePeriod=2 Dec 09 15:01:19 crc kubenswrapper[4555]: I1209 15:01:19.022261 4555 generic.go:334] "Generic (PLEG): container finished" podID="99932cc2-84e4-4996-84ab-55407f09fa3a" containerID="8776f9f010d5bfb704640cf682bb9dc79a3277b20c5f23d9a9d7a103bf7e2cb0" exitCode=0 Dec 09 15:01:19 crc kubenswrapper[4555]: I1209 15:01:19.022823 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vpsml" event={"ID":"99932cc2-84e4-4996-84ab-55407f09fa3a","Type":"ContainerDied","Data":"8776f9f010d5bfb704640cf682bb9dc79a3277b20c5f23d9a9d7a103bf7e2cb0"} Dec 09 15:01:19 crc kubenswrapper[4555]: I1209 15:01:19.151952 4555 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vpsml" Dec 09 15:01:19 crc kubenswrapper[4555]: I1209 15:01:19.165610 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99932cc2-84e4-4996-84ab-55407f09fa3a-catalog-content\") pod \"99932cc2-84e4-4996-84ab-55407f09fa3a\" (UID: \"99932cc2-84e4-4996-84ab-55407f09fa3a\") " Dec 09 15:01:19 crc kubenswrapper[4555]: I1209 15:01:19.165650 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qv5t\" (UniqueName: \"kubernetes.io/projected/99932cc2-84e4-4996-84ab-55407f09fa3a-kube-api-access-6qv5t\") pod \"99932cc2-84e4-4996-84ab-55407f09fa3a\" (UID: \"99932cc2-84e4-4996-84ab-55407f09fa3a\") " Dec 09 15:01:19 crc kubenswrapper[4555]: I1209 15:01:19.165706 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99932cc2-84e4-4996-84ab-55407f09fa3a-utilities\") pod \"99932cc2-84e4-4996-84ab-55407f09fa3a\" (UID: \"99932cc2-84e4-4996-84ab-55407f09fa3a\") " Dec 09 15:01:19 crc kubenswrapper[4555]: I1209 15:01:19.166424 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99932cc2-84e4-4996-84ab-55407f09fa3a-utilities" (OuterVolumeSpecName: "utilities") pod "99932cc2-84e4-4996-84ab-55407f09fa3a" (UID: "99932cc2-84e4-4996-84ab-55407f09fa3a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:01:19 crc kubenswrapper[4555]: I1209 15:01:19.176541 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99932cc2-84e4-4996-84ab-55407f09fa3a-kube-api-access-6qv5t" (OuterVolumeSpecName: "kube-api-access-6qv5t") pod "99932cc2-84e4-4996-84ab-55407f09fa3a" (UID: "99932cc2-84e4-4996-84ab-55407f09fa3a"). InnerVolumeSpecName "kube-api-access-6qv5t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:01:19 crc kubenswrapper[4555]: I1209 15:01:19.252374 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99932cc2-84e4-4996-84ab-55407f09fa3a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "99932cc2-84e4-4996-84ab-55407f09fa3a" (UID: "99932cc2-84e4-4996-84ab-55407f09fa3a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:01:19 crc kubenswrapper[4555]: I1209 15:01:19.267450 4555 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99932cc2-84e4-4996-84ab-55407f09fa3a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 15:01:19 crc kubenswrapper[4555]: I1209 15:01:19.267482 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qv5t\" (UniqueName: \"kubernetes.io/projected/99932cc2-84e4-4996-84ab-55407f09fa3a-kube-api-access-6qv5t\") on node \"crc\" DevicePath \"\"" Dec 09 15:01:19 crc kubenswrapper[4555]: I1209 15:01:19.267496 4555 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99932cc2-84e4-4996-84ab-55407f09fa3a-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 15:01:20 crc kubenswrapper[4555]: I1209 15:01:20.027971 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vpsml" event={"ID":"99932cc2-84e4-4996-84ab-55407f09fa3a","Type":"ContainerDied","Data":"0be65d01bb16c4e697eaf84fc81f3507d7ac12fbd86cf3aadbc9d1558a56d655"} Dec 09 15:01:20 crc kubenswrapper[4555]: I1209 15:01:20.028020 4555 scope.go:117] "RemoveContainer" containerID="8776f9f010d5bfb704640cf682bb9dc79a3277b20c5f23d9a9d7a103bf7e2cb0" Dec 09 15:01:20 crc kubenswrapper[4555]: I1209 15:01:20.028021 4555 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vpsml" Dec 09 15:01:20 crc kubenswrapper[4555]: I1209 15:01:20.041741 4555 scope.go:117] "RemoveContainer" containerID="01b17f78140a90febfd1c80a28c6bec5aa56082e46b0ca339f880856ff1276ff" Dec 09 15:01:20 crc kubenswrapper[4555]: I1209 15:01:20.052870 4555 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vpsml"] Dec 09 15:01:20 crc kubenswrapper[4555]: I1209 15:01:20.058093 4555 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vpsml"] Dec 09 15:01:20 crc kubenswrapper[4555]: I1209 15:01:20.077511 4555 scope.go:117] "RemoveContainer" containerID="88fa7e5250a60cc294d46717a601f6896e0bd90c9d9ad4ae09c96a77d53d0ad4" Dec 09 15:01:20 crc kubenswrapper[4555]: I1209 15:01:20.165845 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99932cc2-84e4-4996-84ab-55407f09fa3a" path="/var/lib/kubelet/pods/99932cc2-84e4-4996-84ab-55407f09fa3a/volumes" Dec 09 15:01:24 crc kubenswrapper[4555]: I1209 15:01:24.334513 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kqnwk" Dec 09 15:01:25 crc kubenswrapper[4555]: I1209 15:01:25.531768 4555 patch_prober.go:28] interesting pod/machine-config-daemon-x6p6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 15:01:25 crc kubenswrapper[4555]: I1209 15:01:25.531822 4555 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 15:01:27 crc kubenswrapper[4555]: I1209 15:01:27.591488 4555 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kqnwk"] Dec 09 15:01:27 crc kubenswrapper[4555]: I1209 15:01:27.591683 4555 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-kqnwk" podUID="d3f6bdc8-6545-4114-982c-f452951e5a0b" containerName="registry-server" containerID="cri-o://e4b0e3c7a0c2ef463ac4142be48c2f5d11ab55969c3e1715b263081cd6de7b58" gracePeriod=2 Dec 09 15:01:27 crc kubenswrapper[4555]: I1209 15:01:27.909098 4555 util.go:48] "No ready sandbox for pod can be found. 
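The liveness failure above for machine-config-daemon-x6p6q is a plain HTTP GET against http://127.0.0.1:8798/health that could not connect at all. The sketch below shows how such an HTTP probe is typically evaluated; the 1-second timeout and the 200-399 success range are common defaults, assumed here because the pod spec is not part of this log:

```go
package main

import (
	"fmt"
	"net/http"
	"time"
)

// probeHTTP mimics an HTTP liveness check: a connection error (as in the
// "connection refused" output above) or a status outside 200-399 counts as
// a failure. The 1s timeout is an assumption, not read from the pod spec.
func probeHTTP(url string) error {
	client := &http.Client{Timeout: 1 * time.Second}
	resp, err := client.Get(url)
	if err != nil {
		return fmt.Errorf("probe failed: %w", err)
	}
	defer resp.Body.Close()
	if resp.StatusCode < 200 || resp.StatusCode >= 400 {
		return fmt.Errorf("probe failed: status %d", resp.StatusCode)
	}
	return nil
}

func main() {
	if err := probeHTTP("http://127.0.0.1:8798/health"); err != nil {
		fmt.Println(err) // e.g. dial tcp 127.0.0.1:8798: connect: connection refused
	}
}
```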
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kqnwk" Dec 09 15:01:27 crc kubenswrapper[4555]: I1209 15:01:27.947977 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3f6bdc8-6545-4114-982c-f452951e5a0b-catalog-content\") pod \"d3f6bdc8-6545-4114-982c-f452951e5a0b\" (UID: \"d3f6bdc8-6545-4114-982c-f452951e5a0b\") " Dec 09 15:01:27 crc kubenswrapper[4555]: I1209 15:01:27.948067 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gkbrb\" (UniqueName: \"kubernetes.io/projected/d3f6bdc8-6545-4114-982c-f452951e5a0b-kube-api-access-gkbrb\") pod \"d3f6bdc8-6545-4114-982c-f452951e5a0b\" (UID: \"d3f6bdc8-6545-4114-982c-f452951e5a0b\") " Dec 09 15:01:27 crc kubenswrapper[4555]: I1209 15:01:27.948103 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3f6bdc8-6545-4114-982c-f452951e5a0b-utilities\") pod \"d3f6bdc8-6545-4114-982c-f452951e5a0b\" (UID: \"d3f6bdc8-6545-4114-982c-f452951e5a0b\") " Dec 09 15:01:27 crc kubenswrapper[4555]: I1209 15:01:27.948724 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3f6bdc8-6545-4114-982c-f452951e5a0b-utilities" (OuterVolumeSpecName: "utilities") pod "d3f6bdc8-6545-4114-982c-f452951e5a0b" (UID: "d3f6bdc8-6545-4114-982c-f452951e5a0b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:01:27 crc kubenswrapper[4555]: I1209 15:01:27.952104 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3f6bdc8-6545-4114-982c-f452951e5a0b-kube-api-access-gkbrb" (OuterVolumeSpecName: "kube-api-access-gkbrb") pod "d3f6bdc8-6545-4114-982c-f452951e5a0b" (UID: "d3f6bdc8-6545-4114-982c-f452951e5a0b"). InnerVolumeSpecName "kube-api-access-gkbrb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:01:27 crc kubenswrapper[4555]: I1209 15:01:27.963805 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3f6bdc8-6545-4114-982c-f452951e5a0b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d3f6bdc8-6545-4114-982c-f452951e5a0b" (UID: "d3f6bdc8-6545-4114-982c-f452951e5a0b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:01:28 crc kubenswrapper[4555]: I1209 15:01:28.049213 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gkbrb\" (UniqueName: \"kubernetes.io/projected/d3f6bdc8-6545-4114-982c-f452951e5a0b-kube-api-access-gkbrb\") on node \"crc\" DevicePath \"\"" Dec 09 15:01:28 crc kubenswrapper[4555]: I1209 15:01:28.049264 4555 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3f6bdc8-6545-4114-982c-f452951e5a0b-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 15:01:28 crc kubenswrapper[4555]: I1209 15:01:28.049274 4555 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3f6bdc8-6545-4114-982c-f452951e5a0b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 15:01:28 crc kubenswrapper[4555]: I1209 15:01:28.065404 4555 generic.go:334] "Generic (PLEG): container finished" podID="d3f6bdc8-6545-4114-982c-f452951e5a0b" containerID="e4b0e3c7a0c2ef463ac4142be48c2f5d11ab55969c3e1715b263081cd6de7b58" exitCode=0 Dec 09 15:01:28 crc kubenswrapper[4555]: I1209 15:01:28.065442 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kqnwk" event={"ID":"d3f6bdc8-6545-4114-982c-f452951e5a0b","Type":"ContainerDied","Data":"e4b0e3c7a0c2ef463ac4142be48c2f5d11ab55969c3e1715b263081cd6de7b58"} Dec 09 15:01:28 crc kubenswrapper[4555]: I1209 15:01:28.065455 4555 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kqnwk" Dec 09 15:01:28 crc kubenswrapper[4555]: I1209 15:01:28.065467 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kqnwk" event={"ID":"d3f6bdc8-6545-4114-982c-f452951e5a0b","Type":"ContainerDied","Data":"322aab1fca241fedf8c3bf6f45d018cdb762a33f0a63eac547bdc66c15c0ce8a"} Dec 09 15:01:28 crc kubenswrapper[4555]: I1209 15:01:28.065485 4555 scope.go:117] "RemoveContainer" containerID="e4b0e3c7a0c2ef463ac4142be48c2f5d11ab55969c3e1715b263081cd6de7b58" Dec 09 15:01:28 crc kubenswrapper[4555]: I1209 15:01:28.077651 4555 scope.go:117] "RemoveContainer" containerID="1e004b337b056bea23a352f5febbc1c88bb46d90f27d7c02409bb8c419359501" Dec 09 15:01:28 crc kubenswrapper[4555]: I1209 15:01:28.083058 4555 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kqnwk"] Dec 09 15:01:28 crc kubenswrapper[4555]: I1209 15:01:28.086072 4555 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-kqnwk"] Dec 09 15:01:28 crc kubenswrapper[4555]: I1209 15:01:28.108796 4555 scope.go:117] "RemoveContainer" containerID="7ba931d6701dd687698d85c9c68091f05fe060614907d61978a91d24d3047ed8" Dec 09 15:01:28 crc kubenswrapper[4555]: I1209 15:01:28.118544 4555 scope.go:117] "RemoveContainer" containerID="e4b0e3c7a0c2ef463ac4142be48c2f5d11ab55969c3e1715b263081cd6de7b58" Dec 09 15:01:28 crc kubenswrapper[4555]: E1209 15:01:28.118884 4555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4b0e3c7a0c2ef463ac4142be48c2f5d11ab55969c3e1715b263081cd6de7b58\": container with ID starting with e4b0e3c7a0c2ef463ac4142be48c2f5d11ab55969c3e1715b263081cd6de7b58 not found: ID does not exist" containerID="e4b0e3c7a0c2ef463ac4142be48c2f5d11ab55969c3e1715b263081cd6de7b58" Dec 09 15:01:28 crc kubenswrapper[4555]: I1209 15:01:28.118926 4555 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4b0e3c7a0c2ef463ac4142be48c2f5d11ab55969c3e1715b263081cd6de7b58"} err="failed to get container status \"e4b0e3c7a0c2ef463ac4142be48c2f5d11ab55969c3e1715b263081cd6de7b58\": rpc error: code = NotFound desc = could not find container \"e4b0e3c7a0c2ef463ac4142be48c2f5d11ab55969c3e1715b263081cd6de7b58\": container with ID starting with e4b0e3c7a0c2ef463ac4142be48c2f5d11ab55969c3e1715b263081cd6de7b58 not found: ID does not exist" Dec 09 15:01:28 crc kubenswrapper[4555]: I1209 15:01:28.118953 4555 scope.go:117] "RemoveContainer" containerID="1e004b337b056bea23a352f5febbc1c88bb46d90f27d7c02409bb8c419359501" Dec 09 15:01:28 crc kubenswrapper[4555]: E1209 15:01:28.119207 4555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e004b337b056bea23a352f5febbc1c88bb46d90f27d7c02409bb8c419359501\": container with ID starting with 1e004b337b056bea23a352f5febbc1c88bb46d90f27d7c02409bb8c419359501 not found: ID does not exist" containerID="1e004b337b056bea23a352f5febbc1c88bb46d90f27d7c02409bb8c419359501" Dec 09 15:01:28 crc kubenswrapper[4555]: I1209 15:01:28.119235 4555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e004b337b056bea23a352f5febbc1c88bb46d90f27d7c02409bb8c419359501"} err="failed to get container status \"1e004b337b056bea23a352f5febbc1c88bb46d90f27d7c02409bb8c419359501\": rpc error: code = NotFound desc = could not find container \"1e004b337b056bea23a352f5febbc1c88bb46d90f27d7c02409bb8c419359501\": container with ID starting with 1e004b337b056bea23a352f5febbc1c88bb46d90f27d7c02409bb8c419359501 not found: ID does not exist" Dec 09 15:01:28 crc kubenswrapper[4555]: I1209 15:01:28.119253 4555 scope.go:117] "RemoveContainer" containerID="7ba931d6701dd687698d85c9c68091f05fe060614907d61978a91d24d3047ed8" Dec 09 15:01:28 crc kubenswrapper[4555]: E1209 15:01:28.119456 4555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ba931d6701dd687698d85c9c68091f05fe060614907d61978a91d24d3047ed8\": container with ID starting with 7ba931d6701dd687698d85c9c68091f05fe060614907d61978a91d24d3047ed8 not found: ID does not exist" containerID="7ba931d6701dd687698d85c9c68091f05fe060614907d61978a91d24d3047ed8" Dec 09 15:01:28 crc kubenswrapper[4555]: I1209 15:01:28.119476 4555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ba931d6701dd687698d85c9c68091f05fe060614907d61978a91d24d3047ed8"} err="failed to get container status \"7ba931d6701dd687698d85c9c68091f05fe060614907d61978a91d24d3047ed8\": rpc error: code = NotFound desc = could not find container \"7ba931d6701dd687698d85c9c68091f05fe060614907d61978a91d24d3047ed8\": container with ID starting with 7ba931d6701dd687698d85c9c68091f05fe060614907d61978a91d24d3047ed8 not found: ID does not exist" Dec 09 15:01:28 crc kubenswrapper[4555]: I1209 15:01:28.165612 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3f6bdc8-6545-4114-982c-f452951e5a0b" path="/var/lib/kubelet/pods/d3f6bdc8-6545-4114-982c-f452951e5a0b/volumes" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.017781 4555 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-v5j6b" podUID="89fdb2fe-9563-4934-82c8-10c218d81bea" containerName="oauth-openshift" 
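The "ContainerStatus from runtime service failed" and "DeleteContainer returned error" records above are NotFound responses from the runtime for container IDs that had already been cleaned up; the kubelet logs them and carries on (the orphaned volumes dir for the pod is removed a moment later). A sketch of that tolerant-cleanup pattern using gRPC status codes; removeContainer below is a stand-in closure, not a real CRI client:

```go
package main

import (
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// removeIfPresent treats a NotFound from the runtime as "already gone",
// mirroring the tolerant cleanup visible in the records above.
// removeContainer is a stand-in for a CRI call, not a real client.
func removeIfPresent(removeContainer func(id string) error, id string) error {
	if err := removeContainer(id); err != nil && status.Code(err) != codes.NotFound {
		return fmt.Errorf("remove container %s: %w", id, err)
	}
	return nil // removed, or already gone
}

func main() {
	alreadyGone := func(string) error {
		return status.Error(codes.NotFound, "could not find container")
	}
	fmt.Println(removeIfPresent(alreadyGone, "example-container-id")) // <nil>
}
```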
containerID="cri-o://91d9afced0c772ad319a8a18e08508bc11e29dff86ec3c7b921e59b3582aa84d" gracePeriod=15 Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.299516 4555 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-v5j6b" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.318869 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-644868f-msgvb"] Dec 09 15:01:39 crc kubenswrapper[4555]: E1209 15:01:39.319043 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99932cc2-84e4-4996-84ab-55407f09fa3a" containerName="extract-utilities" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.319054 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="99932cc2-84e4-4996-84ab-55407f09fa3a" containerName="extract-utilities" Dec 09 15:01:39 crc kubenswrapper[4555]: E1209 15:01:39.319063 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3f6bdc8-6545-4114-982c-f452951e5a0b" containerName="extract-utilities" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.319069 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3f6bdc8-6545-4114-982c-f452951e5a0b" containerName="extract-utilities" Dec 09 15:01:39 crc kubenswrapper[4555]: E1209 15:01:39.319080 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99932cc2-84e4-4996-84ab-55407f09fa3a" containerName="registry-server" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.319086 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="99932cc2-84e4-4996-84ab-55407f09fa3a" containerName="registry-server" Dec 09 15:01:39 crc kubenswrapper[4555]: E1209 15:01:39.319095 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3f6bdc8-6545-4114-982c-f452951e5a0b" containerName="registry-server" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.319100 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3f6bdc8-6545-4114-982c-f452951e5a0b" containerName="registry-server" Dec 09 15:01:39 crc kubenswrapper[4555]: E1209 15:01:39.319109 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3f6bdc8-6545-4114-982c-f452951e5a0b" containerName="extract-content" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.319114 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3f6bdc8-6545-4114-982c-f452951e5a0b" containerName="extract-content" Dec 09 15:01:39 crc kubenswrapper[4555]: E1209 15:01:39.319120 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99932cc2-84e4-4996-84ab-55407f09fa3a" containerName="extract-content" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.319125 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="99932cc2-84e4-4996-84ab-55407f09fa3a" containerName="extract-content" Dec 09 15:01:39 crc kubenswrapper[4555]: E1209 15:01:39.319136 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89fdb2fe-9563-4934-82c8-10c218d81bea" containerName="oauth-openshift" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.319141 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="89fdb2fe-9563-4934-82c8-10c218d81bea" containerName="oauth-openshift" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.319219 4555 memory_manager.go:354] "RemoveStaleState removing state" podUID="89fdb2fe-9563-4934-82c8-10c218d81bea" containerName="oauth-openshift" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.319230 4555 
memory_manager.go:354] "RemoveStaleState removing state" podUID="99932cc2-84e4-4996-84ab-55407f09fa3a" containerName="registry-server" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.319236 4555 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3f6bdc8-6545-4114-982c-f452951e5a0b" containerName="registry-server" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.319545 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-644868f-msgvb" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.329016 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-644868f-msgvb"] Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.359441 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/89fdb2fe-9563-4934-82c8-10c218d81bea-v4-0-config-user-template-login\") pod \"89fdb2fe-9563-4934-82c8-10c218d81bea\" (UID: \"89fdb2fe-9563-4934-82c8-10c218d81bea\") " Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.359481 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/89fdb2fe-9563-4934-82c8-10c218d81bea-audit-policies\") pod \"89fdb2fe-9563-4934-82c8-10c218d81bea\" (UID: \"89fdb2fe-9563-4934-82c8-10c218d81bea\") " Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.359504 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/89fdb2fe-9563-4934-82c8-10c218d81bea-v4-0-config-system-serving-cert\") pod \"89fdb2fe-9563-4934-82c8-10c218d81bea\" (UID: \"89fdb2fe-9563-4934-82c8-10c218d81bea\") " Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.359525 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/89fdb2fe-9563-4934-82c8-10c218d81bea-audit-dir\") pod \"89fdb2fe-9563-4934-82c8-10c218d81bea\" (UID: \"89fdb2fe-9563-4934-82c8-10c218d81bea\") " Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.359542 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/89fdb2fe-9563-4934-82c8-10c218d81bea-v4-0-config-user-idp-0-file-data\") pod \"89fdb2fe-9563-4934-82c8-10c218d81bea\" (UID: \"89fdb2fe-9563-4934-82c8-10c218d81bea\") " Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.359561 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrqkx\" (UniqueName: \"kubernetes.io/projected/89fdb2fe-9563-4934-82c8-10c218d81bea-kube-api-access-xrqkx\") pod \"89fdb2fe-9563-4934-82c8-10c218d81bea\" (UID: \"89fdb2fe-9563-4934-82c8-10c218d81bea\") " Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.359584 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/89fdb2fe-9563-4934-82c8-10c218d81bea-v4-0-config-system-service-ca\") pod \"89fdb2fe-9563-4934-82c8-10c218d81bea\" (UID: \"89fdb2fe-9563-4934-82c8-10c218d81bea\") " Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.359602 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/89fdb2fe-9563-4934-82c8-10c218d81bea-v4-0-config-user-template-error\") pod \"89fdb2fe-9563-4934-82c8-10c218d81bea\" (UID: \"89fdb2fe-9563-4934-82c8-10c218d81bea\") " Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.359617 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/89fdb2fe-9563-4934-82c8-10c218d81bea-v4-0-config-system-cliconfig\") pod \"89fdb2fe-9563-4934-82c8-10c218d81bea\" (UID: \"89fdb2fe-9563-4934-82c8-10c218d81bea\") " Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.359633 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89fdb2fe-9563-4934-82c8-10c218d81bea-v4-0-config-system-trusted-ca-bundle\") pod \"89fdb2fe-9563-4934-82c8-10c218d81bea\" (UID: \"89fdb2fe-9563-4934-82c8-10c218d81bea\") " Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.359648 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/89fdb2fe-9563-4934-82c8-10c218d81bea-v4-0-config-system-router-certs\") pod \"89fdb2fe-9563-4934-82c8-10c218d81bea\" (UID: \"89fdb2fe-9563-4934-82c8-10c218d81bea\") " Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.359666 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/89fdb2fe-9563-4934-82c8-10c218d81bea-v4-0-config-system-ocp-branding-template\") pod \"89fdb2fe-9563-4934-82c8-10c218d81bea\" (UID: \"89fdb2fe-9563-4934-82c8-10c218d81bea\") " Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.359699 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/89fdb2fe-9563-4934-82c8-10c218d81bea-v4-0-config-user-template-provider-selection\") pod \"89fdb2fe-9563-4934-82c8-10c218d81bea\" (UID: \"89fdb2fe-9563-4934-82c8-10c218d81bea\") " Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.359728 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/89fdb2fe-9563-4934-82c8-10c218d81bea-v4-0-config-system-session\") pod \"89fdb2fe-9563-4934-82c8-10c218d81bea\" (UID: \"89fdb2fe-9563-4934-82c8-10c218d81bea\") " Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.359801 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/86acd50f-8cc6-4d2b-9127-501d1eacfbfd-v4-0-config-system-service-ca\") pod \"oauth-openshift-644868f-msgvb\" (UID: \"86acd50f-8cc6-4d2b-9127-501d1eacfbfd\") " pod="openshift-authentication/oauth-openshift-644868f-msgvb" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.359822 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/86acd50f-8cc6-4d2b-9127-501d1eacfbfd-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-644868f-msgvb\" (UID: \"86acd50f-8cc6-4d2b-9127-501d1eacfbfd\") " pod="openshift-authentication/oauth-openshift-644868f-msgvb" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.359839 4555 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/86acd50f-8cc6-4d2b-9127-501d1eacfbfd-v4-0-config-user-template-error\") pod \"oauth-openshift-644868f-msgvb\" (UID: \"86acd50f-8cc6-4d2b-9127-501d1eacfbfd\") " pod="openshift-authentication/oauth-openshift-644868f-msgvb" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.359857 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/86acd50f-8cc6-4d2b-9127-501d1eacfbfd-audit-policies\") pod \"oauth-openshift-644868f-msgvb\" (UID: \"86acd50f-8cc6-4d2b-9127-501d1eacfbfd\") " pod="openshift-authentication/oauth-openshift-644868f-msgvb" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.359890 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/86acd50f-8cc6-4d2b-9127-501d1eacfbfd-v4-0-config-user-template-login\") pod \"oauth-openshift-644868f-msgvb\" (UID: \"86acd50f-8cc6-4d2b-9127-501d1eacfbfd\") " pod="openshift-authentication/oauth-openshift-644868f-msgvb" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.359904 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/86acd50f-8cc6-4d2b-9127-501d1eacfbfd-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-644868f-msgvb\" (UID: \"86acd50f-8cc6-4d2b-9127-501d1eacfbfd\") " pod="openshift-authentication/oauth-openshift-644868f-msgvb" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.359928 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/86acd50f-8cc6-4d2b-9127-501d1eacfbfd-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-644868f-msgvb\" (UID: \"86acd50f-8cc6-4d2b-9127-501d1eacfbfd\") " pod="openshift-authentication/oauth-openshift-644868f-msgvb" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.359942 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wpmn\" (UniqueName: \"kubernetes.io/projected/86acd50f-8cc6-4d2b-9127-501d1eacfbfd-kube-api-access-7wpmn\") pod \"oauth-openshift-644868f-msgvb\" (UID: \"86acd50f-8cc6-4d2b-9127-501d1eacfbfd\") " pod="openshift-authentication/oauth-openshift-644868f-msgvb" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.359959 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/86acd50f-8cc6-4d2b-9127-501d1eacfbfd-v4-0-config-system-cliconfig\") pod \"oauth-openshift-644868f-msgvb\" (UID: \"86acd50f-8cc6-4d2b-9127-501d1eacfbfd\") " pod="openshift-authentication/oauth-openshift-644868f-msgvb" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.359974 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/86acd50f-8cc6-4d2b-9127-501d1eacfbfd-audit-dir\") pod \"oauth-openshift-644868f-msgvb\" (UID: \"86acd50f-8cc6-4d2b-9127-501d1eacfbfd\") " pod="openshift-authentication/oauth-openshift-644868f-msgvb" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 
15:01:39.359988 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/86acd50f-8cc6-4d2b-9127-501d1eacfbfd-v4-0-config-system-serving-cert\") pod \"oauth-openshift-644868f-msgvb\" (UID: \"86acd50f-8cc6-4d2b-9127-501d1eacfbfd\") " pod="openshift-authentication/oauth-openshift-644868f-msgvb" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.360003 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/86acd50f-8cc6-4d2b-9127-501d1eacfbfd-v4-0-config-system-session\") pod \"oauth-openshift-644868f-msgvb\" (UID: \"86acd50f-8cc6-4d2b-9127-501d1eacfbfd\") " pod="openshift-authentication/oauth-openshift-644868f-msgvb" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.360018 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/86acd50f-8cc6-4d2b-9127-501d1eacfbfd-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-644868f-msgvb\" (UID: \"86acd50f-8cc6-4d2b-9127-501d1eacfbfd\") " pod="openshift-authentication/oauth-openshift-644868f-msgvb" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.360042 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/86acd50f-8cc6-4d2b-9127-501d1eacfbfd-v4-0-config-system-router-certs\") pod \"oauth-openshift-644868f-msgvb\" (UID: \"86acd50f-8cc6-4d2b-9127-501d1eacfbfd\") " pod="openshift-authentication/oauth-openshift-644868f-msgvb" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.360206 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/89fdb2fe-9563-4934-82c8-10c218d81bea-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "89fdb2fe-9563-4934-82c8-10c218d81bea" (UID: "89fdb2fe-9563-4934-82c8-10c218d81bea"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.360635 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89fdb2fe-9563-4934-82c8-10c218d81bea-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "89fdb2fe-9563-4934-82c8-10c218d81bea" (UID: "89fdb2fe-9563-4934-82c8-10c218d81bea"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.360712 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89fdb2fe-9563-4934-82c8-10c218d81bea-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "89fdb2fe-9563-4934-82c8-10c218d81bea" (UID: "89fdb2fe-9563-4934-82c8-10c218d81bea"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.361024 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89fdb2fe-9563-4934-82c8-10c218d81bea-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "89fdb2fe-9563-4934-82c8-10c218d81bea" (UID: "89fdb2fe-9563-4934-82c8-10c218d81bea"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.361688 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89fdb2fe-9563-4934-82c8-10c218d81bea-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "89fdb2fe-9563-4934-82c8-10c218d81bea" (UID: "89fdb2fe-9563-4934-82c8-10c218d81bea"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.364037 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89fdb2fe-9563-4934-82c8-10c218d81bea-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "89fdb2fe-9563-4934-82c8-10c218d81bea" (UID: "89fdb2fe-9563-4934-82c8-10c218d81bea"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.364441 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89fdb2fe-9563-4934-82c8-10c218d81bea-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "89fdb2fe-9563-4934-82c8-10c218d81bea" (UID: "89fdb2fe-9563-4934-82c8-10c218d81bea"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.364562 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89fdb2fe-9563-4934-82c8-10c218d81bea-kube-api-access-xrqkx" (OuterVolumeSpecName: "kube-api-access-xrqkx") pod "89fdb2fe-9563-4934-82c8-10c218d81bea" (UID: "89fdb2fe-9563-4934-82c8-10c218d81bea"). InnerVolumeSpecName "kube-api-access-xrqkx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.364616 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89fdb2fe-9563-4934-82c8-10c218d81bea-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "89fdb2fe-9563-4934-82c8-10c218d81bea" (UID: "89fdb2fe-9563-4934-82c8-10c218d81bea"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.364777 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89fdb2fe-9563-4934-82c8-10c218d81bea-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "89fdb2fe-9563-4934-82c8-10c218d81bea" (UID: "89fdb2fe-9563-4934-82c8-10c218d81bea"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.365018 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89fdb2fe-9563-4934-82c8-10c218d81bea-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "89fdb2fe-9563-4934-82c8-10c218d81bea" (UID: "89fdb2fe-9563-4934-82c8-10c218d81bea"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.365127 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89fdb2fe-9563-4934-82c8-10c218d81bea-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "89fdb2fe-9563-4934-82c8-10c218d81bea" (UID: "89fdb2fe-9563-4934-82c8-10c218d81bea"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.365263 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89fdb2fe-9563-4934-82c8-10c218d81bea-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "89fdb2fe-9563-4934-82c8-10c218d81bea" (UID: "89fdb2fe-9563-4934-82c8-10c218d81bea"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.365482 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89fdb2fe-9563-4934-82c8-10c218d81bea-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "89fdb2fe-9563-4934-82c8-10c218d81bea" (UID: "89fdb2fe-9563-4934-82c8-10c218d81bea"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.460784 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/86acd50f-8cc6-4d2b-9127-501d1eacfbfd-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-644868f-msgvb\" (UID: \"86acd50f-8cc6-4d2b-9127-501d1eacfbfd\") " pod="openshift-authentication/oauth-openshift-644868f-msgvb" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.460837 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wpmn\" (UniqueName: \"kubernetes.io/projected/86acd50f-8cc6-4d2b-9127-501d1eacfbfd-kube-api-access-7wpmn\") pod \"oauth-openshift-644868f-msgvb\" (UID: \"86acd50f-8cc6-4d2b-9127-501d1eacfbfd\") " pod="openshift-authentication/oauth-openshift-644868f-msgvb" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.460860 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/86acd50f-8cc6-4d2b-9127-501d1eacfbfd-v4-0-config-system-cliconfig\") pod \"oauth-openshift-644868f-msgvb\" (UID: \"86acd50f-8cc6-4d2b-9127-501d1eacfbfd\") " pod="openshift-authentication/oauth-openshift-644868f-msgvb" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.460876 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/86acd50f-8cc6-4d2b-9127-501d1eacfbfd-audit-dir\") pod \"oauth-openshift-644868f-msgvb\" (UID: \"86acd50f-8cc6-4d2b-9127-501d1eacfbfd\") " pod="openshift-authentication/oauth-openshift-644868f-msgvb" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.460894 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/86acd50f-8cc6-4d2b-9127-501d1eacfbfd-v4-0-config-system-session\") pod \"oauth-openshift-644868f-msgvb\" (UID: \"86acd50f-8cc6-4d2b-9127-501d1eacfbfd\") " pod="openshift-authentication/oauth-openshift-644868f-msgvb" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.460907 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/86acd50f-8cc6-4d2b-9127-501d1eacfbfd-v4-0-config-system-serving-cert\") pod \"oauth-openshift-644868f-msgvb\" (UID: \"86acd50f-8cc6-4d2b-9127-501d1eacfbfd\") " pod="openshift-authentication/oauth-openshift-644868f-msgvb" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.460922 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/86acd50f-8cc6-4d2b-9127-501d1eacfbfd-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-644868f-msgvb\" (UID: \"86acd50f-8cc6-4d2b-9127-501d1eacfbfd\") " pod="openshift-authentication/oauth-openshift-644868f-msgvb" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.460955 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/86acd50f-8cc6-4d2b-9127-501d1eacfbfd-v4-0-config-system-router-certs\") pod \"oauth-openshift-644868f-msgvb\" (UID: \"86acd50f-8cc6-4d2b-9127-501d1eacfbfd\") " pod="openshift-authentication/oauth-openshift-644868f-msgvb" Dec 09 15:01:39 crc 
kubenswrapper[4555]: I1209 15:01:39.460970 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/86acd50f-8cc6-4d2b-9127-501d1eacfbfd-v4-0-config-system-service-ca\") pod \"oauth-openshift-644868f-msgvb\" (UID: \"86acd50f-8cc6-4d2b-9127-501d1eacfbfd\") " pod="openshift-authentication/oauth-openshift-644868f-msgvb" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.460986 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/86acd50f-8cc6-4d2b-9127-501d1eacfbfd-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-644868f-msgvb\" (UID: \"86acd50f-8cc6-4d2b-9127-501d1eacfbfd\") " pod="openshift-authentication/oauth-openshift-644868f-msgvb" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.461003 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/86acd50f-8cc6-4d2b-9127-501d1eacfbfd-v4-0-config-user-template-error\") pod \"oauth-openshift-644868f-msgvb\" (UID: \"86acd50f-8cc6-4d2b-9127-501d1eacfbfd\") " pod="openshift-authentication/oauth-openshift-644868f-msgvb" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.461023 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/86acd50f-8cc6-4d2b-9127-501d1eacfbfd-audit-policies\") pod \"oauth-openshift-644868f-msgvb\" (UID: \"86acd50f-8cc6-4d2b-9127-501d1eacfbfd\") " pod="openshift-authentication/oauth-openshift-644868f-msgvb" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.461065 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/86acd50f-8cc6-4d2b-9127-501d1eacfbfd-v4-0-config-user-template-login\") pod \"oauth-openshift-644868f-msgvb\" (UID: \"86acd50f-8cc6-4d2b-9127-501d1eacfbfd\") " pod="openshift-authentication/oauth-openshift-644868f-msgvb" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.461078 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/86acd50f-8cc6-4d2b-9127-501d1eacfbfd-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-644868f-msgvb\" (UID: \"86acd50f-8cc6-4d2b-9127-501d1eacfbfd\") " pod="openshift-authentication/oauth-openshift-644868f-msgvb" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.461121 4555 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/89fdb2fe-9563-4934-82c8-10c218d81bea-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.461132 4555 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/89fdb2fe-9563-4934-82c8-10c218d81bea-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.461142 4555 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/89fdb2fe-9563-4934-82c8-10c218d81bea-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 
15:01:39.461150 4555 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/89fdb2fe-9563-4934-82c8-10c218d81bea-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.461160 4555 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/89fdb2fe-9563-4934-82c8-10c218d81bea-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.461169 4555 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/89fdb2fe-9563-4934-82c8-10c218d81bea-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.461177 4555 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/89fdb2fe-9563-4934-82c8-10c218d81bea-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.461185 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrqkx\" (UniqueName: \"kubernetes.io/projected/89fdb2fe-9563-4934-82c8-10c218d81bea-kube-api-access-xrqkx\") on node \"crc\" DevicePath \"\"" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.461193 4555 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/89fdb2fe-9563-4934-82c8-10c218d81bea-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.461203 4555 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/89fdb2fe-9563-4934-82c8-10c218d81bea-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.461211 4555 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/89fdb2fe-9563-4934-82c8-10c218d81bea-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.461220 4555 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/89fdb2fe-9563-4934-82c8-10c218d81bea-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.461228 4555 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89fdb2fe-9563-4934-82c8-10c218d81bea-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.461238 4555 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/89fdb2fe-9563-4934-82c8-10c218d81bea-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.461425 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/86acd50f-8cc6-4d2b-9127-501d1eacfbfd-audit-dir\") pod \"oauth-openshift-644868f-msgvb\" (UID: \"86acd50f-8cc6-4d2b-9127-501d1eacfbfd\") " 
pod="openshift-authentication/oauth-openshift-644868f-msgvb" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.461567 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/86acd50f-8cc6-4d2b-9127-501d1eacfbfd-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-644868f-msgvb\" (UID: \"86acd50f-8cc6-4d2b-9127-501d1eacfbfd\") " pod="openshift-authentication/oauth-openshift-644868f-msgvb" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.461751 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/86acd50f-8cc6-4d2b-9127-501d1eacfbfd-audit-policies\") pod \"oauth-openshift-644868f-msgvb\" (UID: \"86acd50f-8cc6-4d2b-9127-501d1eacfbfd\") " pod="openshift-authentication/oauth-openshift-644868f-msgvb" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.462138 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/86acd50f-8cc6-4d2b-9127-501d1eacfbfd-v4-0-config-system-service-ca\") pod \"oauth-openshift-644868f-msgvb\" (UID: \"86acd50f-8cc6-4d2b-9127-501d1eacfbfd\") " pod="openshift-authentication/oauth-openshift-644868f-msgvb" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.462142 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/86acd50f-8cc6-4d2b-9127-501d1eacfbfd-v4-0-config-system-cliconfig\") pod \"oauth-openshift-644868f-msgvb\" (UID: \"86acd50f-8cc6-4d2b-9127-501d1eacfbfd\") " pod="openshift-authentication/oauth-openshift-644868f-msgvb" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.463501 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/86acd50f-8cc6-4d2b-9127-501d1eacfbfd-v4-0-config-user-template-error\") pod \"oauth-openshift-644868f-msgvb\" (UID: \"86acd50f-8cc6-4d2b-9127-501d1eacfbfd\") " pod="openshift-authentication/oauth-openshift-644868f-msgvb" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.463566 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/86acd50f-8cc6-4d2b-9127-501d1eacfbfd-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-644868f-msgvb\" (UID: \"86acd50f-8cc6-4d2b-9127-501d1eacfbfd\") " pod="openshift-authentication/oauth-openshift-644868f-msgvb" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.464085 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/86acd50f-8cc6-4d2b-9127-501d1eacfbfd-v4-0-config-system-serving-cert\") pod \"oauth-openshift-644868f-msgvb\" (UID: \"86acd50f-8cc6-4d2b-9127-501d1eacfbfd\") " pod="openshift-authentication/oauth-openshift-644868f-msgvb" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.464108 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/86acd50f-8cc6-4d2b-9127-501d1eacfbfd-v4-0-config-system-router-certs\") pod \"oauth-openshift-644868f-msgvb\" (UID: \"86acd50f-8cc6-4d2b-9127-501d1eacfbfd\") " pod="openshift-authentication/oauth-openshift-644868f-msgvb" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.463886 4555 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/86acd50f-8cc6-4d2b-9127-501d1eacfbfd-v4-0-config-system-session\") pod \"oauth-openshift-644868f-msgvb\" (UID: \"86acd50f-8cc6-4d2b-9127-501d1eacfbfd\") " pod="openshift-authentication/oauth-openshift-644868f-msgvb" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.464610 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/86acd50f-8cc6-4d2b-9127-501d1eacfbfd-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-644868f-msgvb\" (UID: \"86acd50f-8cc6-4d2b-9127-501d1eacfbfd\") " pod="openshift-authentication/oauth-openshift-644868f-msgvb" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.464947 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/86acd50f-8cc6-4d2b-9127-501d1eacfbfd-v4-0-config-user-template-login\") pod \"oauth-openshift-644868f-msgvb\" (UID: \"86acd50f-8cc6-4d2b-9127-501d1eacfbfd\") " pod="openshift-authentication/oauth-openshift-644868f-msgvb" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.465180 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/86acd50f-8cc6-4d2b-9127-501d1eacfbfd-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-644868f-msgvb\" (UID: \"86acd50f-8cc6-4d2b-9127-501d1eacfbfd\") " pod="openshift-authentication/oauth-openshift-644868f-msgvb" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.473335 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wpmn\" (UniqueName: \"kubernetes.io/projected/86acd50f-8cc6-4d2b-9127-501d1eacfbfd-kube-api-access-7wpmn\") pod \"oauth-openshift-644868f-msgvb\" (UID: \"86acd50f-8cc6-4d2b-9127-501d1eacfbfd\") " pod="openshift-authentication/oauth-openshift-644868f-msgvb" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.634213 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-644868f-msgvb" Dec 09 15:01:39 crc kubenswrapper[4555]: I1209 15:01:39.953580 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-644868f-msgvb"] Dec 09 15:01:39 crc kubenswrapper[4555]: W1209 15:01:39.957272 4555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86acd50f_8cc6_4d2b_9127_501d1eacfbfd.slice/crio-219315e98a176a6a376fb5477cad9adfc1311fe4bdea55465a55d8a020eb4a23 WatchSource:0}: Error finding container 219315e98a176a6a376fb5477cad9adfc1311fe4bdea55465a55d8a020eb4a23: Status 404 returned error can't find the container with id 219315e98a176a6a376fb5477cad9adfc1311fe4bdea55465a55d8a020eb4a23 Dec 09 15:01:40 crc kubenswrapper[4555]: I1209 15:01:40.112870 4555 generic.go:334] "Generic (PLEG): container finished" podID="89fdb2fe-9563-4934-82c8-10c218d81bea" containerID="91d9afced0c772ad319a8a18e08508bc11e29dff86ec3c7b921e59b3582aa84d" exitCode=0 Dec 09 15:01:40 crc kubenswrapper[4555]: I1209 15:01:40.112909 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-v5j6b" event={"ID":"89fdb2fe-9563-4934-82c8-10c218d81bea","Type":"ContainerDied","Data":"91d9afced0c772ad319a8a18e08508bc11e29dff86ec3c7b921e59b3582aa84d"} Dec 09 15:01:40 crc kubenswrapper[4555]: I1209 15:01:40.113084 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-v5j6b" event={"ID":"89fdb2fe-9563-4934-82c8-10c218d81bea","Type":"ContainerDied","Data":"9b6a9c7b7dcd4eb25a2c8d2d0bbd99100cf80721e10db28a61e1e1e38b26ad45"} Dec 09 15:01:40 crc kubenswrapper[4555]: I1209 15:01:40.113103 4555 scope.go:117] "RemoveContainer" containerID="91d9afced0c772ad319a8a18e08508bc11e29dff86ec3c7b921e59b3582aa84d" Dec 09 15:01:40 crc kubenswrapper[4555]: I1209 15:01:40.112926 4555 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-v5j6b" Dec 09 15:01:40 crc kubenswrapper[4555]: I1209 15:01:40.115967 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-644868f-msgvb" event={"ID":"86acd50f-8cc6-4d2b-9127-501d1eacfbfd","Type":"ContainerStarted","Data":"b1845e200e1f2c13ac2490b868a24b63967b96a6bdc4af43c5d842f829403f77"} Dec 09 15:01:40 crc kubenswrapper[4555]: I1209 15:01:40.115997 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-644868f-msgvb" event={"ID":"86acd50f-8cc6-4d2b-9127-501d1eacfbfd","Type":"ContainerStarted","Data":"219315e98a176a6a376fb5477cad9adfc1311fe4bdea55465a55d8a020eb4a23"} Dec 09 15:01:40 crc kubenswrapper[4555]: I1209 15:01:40.116166 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-644868f-msgvb" Dec 09 15:01:40 crc kubenswrapper[4555]: I1209 15:01:40.118741 4555 patch_prober.go:28] interesting pod/oauth-openshift-644868f-msgvb container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.56:6443/healthz\": dial tcp 10.217.0.56:6443: connect: connection refused" start-of-body= Dec 09 15:01:40 crc kubenswrapper[4555]: I1209 15:01:40.118782 4555 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-644868f-msgvb" podUID="86acd50f-8cc6-4d2b-9127-501d1eacfbfd" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.56:6443/healthz\": dial tcp 10.217.0.56:6443: connect: connection refused" Dec 09 15:01:40 crc kubenswrapper[4555]: I1209 15:01:40.126616 4555 scope.go:117] "RemoveContainer" containerID="91d9afced0c772ad319a8a18e08508bc11e29dff86ec3c7b921e59b3582aa84d" Dec 09 15:01:40 crc kubenswrapper[4555]: E1209 15:01:40.126921 4555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91d9afced0c772ad319a8a18e08508bc11e29dff86ec3c7b921e59b3582aa84d\": container with ID starting with 91d9afced0c772ad319a8a18e08508bc11e29dff86ec3c7b921e59b3582aa84d not found: ID does not exist" containerID="91d9afced0c772ad319a8a18e08508bc11e29dff86ec3c7b921e59b3582aa84d" Dec 09 15:01:40 crc kubenswrapper[4555]: I1209 15:01:40.126977 4555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91d9afced0c772ad319a8a18e08508bc11e29dff86ec3c7b921e59b3582aa84d"} err="failed to get container status \"91d9afced0c772ad319a8a18e08508bc11e29dff86ec3c7b921e59b3582aa84d\": rpc error: code = NotFound desc = could not find container \"91d9afced0c772ad319a8a18e08508bc11e29dff86ec3c7b921e59b3582aa84d\": container with ID starting with 91d9afced0c772ad319a8a18e08508bc11e29dff86ec3c7b921e59b3582aa84d not found: ID does not exist" Dec 09 15:01:40 crc kubenswrapper[4555]: I1209 15:01:40.130932 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-644868f-msgvb" podStartSLOduration=26.130921262 podStartE2EDuration="26.130921262s" podCreationTimestamp="2025-12-09 15:01:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:01:40.127945388 +0000 UTC m=+210.264658106" watchObservedRunningTime="2025-12-09 15:01:40.130921262 +0000 UTC m=+210.267633979" Dec 09 15:01:40 crc kubenswrapper[4555]: I1209 
15:01:40.162686 4555 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-v5j6b"] Dec 09 15:01:40 crc kubenswrapper[4555]: I1209 15:01:40.176234 4555 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-v5j6b"] Dec 09 15:01:41 crc kubenswrapper[4555]: I1209 15:01:41.123183 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-644868f-msgvb" Dec 09 15:01:42 crc kubenswrapper[4555]: I1209 15:01:42.166096 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89fdb2fe-9563-4934-82c8-10c218d81bea" path="/var/lib/kubelet/pods/89fdb2fe-9563-4934-82c8-10c218d81bea/volumes" Dec 09 15:01:54 crc kubenswrapper[4555]: I1209 15:01:54.597508 4555 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 09 15:01:54 crc kubenswrapper[4555]: I1209 15:01:54.598458 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 15:01:54 crc kubenswrapper[4555]: I1209 15:01:54.598763 4555 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 09 15:01:54 crc kubenswrapper[4555]: I1209 15:01:54.598790 4555 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 09 15:01:54 crc kubenswrapper[4555]: E1209 15:01:54.598905 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 09 15:01:54 crc kubenswrapper[4555]: I1209 15:01:54.598920 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 09 15:01:54 crc kubenswrapper[4555]: E1209 15:01:54.598928 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 09 15:01:54 crc kubenswrapper[4555]: I1209 15:01:54.598933 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 09 15:01:54 crc kubenswrapper[4555]: E1209 15:01:54.598943 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 09 15:01:54 crc kubenswrapper[4555]: I1209 15:01:54.598950 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 09 15:01:54 crc kubenswrapper[4555]: E1209 15:01:54.598957 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 09 15:01:54 crc kubenswrapper[4555]: I1209 15:01:54.598962 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 09 15:01:54 crc kubenswrapper[4555]: E1209 15:01:54.598969 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 09 15:01:54 crc kubenswrapper[4555]: I1209 15:01:54.598975 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 09 15:01:54 crc kubenswrapper[4555]: E1209 
15:01:54.598982 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 09 15:01:54 crc kubenswrapper[4555]: I1209 15:01:54.598988 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 09 15:01:54 crc kubenswrapper[4555]: I1209 15:01:54.599060 4555 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 09 15:01:54 crc kubenswrapper[4555]: I1209 15:01:54.599079 4555 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 09 15:01:54 crc kubenswrapper[4555]: I1209 15:01:54.599084 4555 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 09 15:01:54 crc kubenswrapper[4555]: I1209 15:01:54.599093 4555 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 09 15:01:54 crc kubenswrapper[4555]: I1209 15:01:54.599102 4555 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 09 15:01:54 crc kubenswrapper[4555]: I1209 15:01:54.599597 4555 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://cd53bcc6affff59de2add405be1496710b4488d3794d358533ddaeecb1c50c42" gracePeriod=15 Dec 09 15:01:54 crc kubenswrapper[4555]: I1209 15:01:54.599624 4555 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://7803e19de1c3c096dfb29cd40b66a640cf8fe695b24eaf0ecfef38e88feefc27" gracePeriod=15 Dec 09 15:01:54 crc kubenswrapper[4555]: I1209 15:01:54.599654 4555 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://c591bf964c364ed5dfb0b30ae43f7fd456c549f701b0f1b8f7ab9ca7b8648f1b" gracePeriod=15 Dec 09 15:01:54 crc kubenswrapper[4555]: I1209 15:01:54.599707 4555 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://26cd3af11c101edd63da0124a2190c0ee60408cf12e241de0b5743aff5732738" gracePeriod=15 Dec 09 15:01:54 crc kubenswrapper[4555]: I1209 15:01:54.599730 4555 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://787954aa7f87026339e7b167ba10c60c81f91c4d0f234fc508684f929edda0ed" gracePeriod=15 Dec 09 15:01:54 crc kubenswrapper[4555]: I1209 15:01:54.631095 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 09 15:01:54 crc 
kubenswrapper[4555]: I1209 15:01:54.701913 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 15:01:54 crc kubenswrapper[4555]: I1209 15:01:54.701954 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 15:01:54 crc kubenswrapper[4555]: I1209 15:01:54.702017 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 15:01:54 crc kubenswrapper[4555]: I1209 15:01:54.702035 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 15:01:54 crc kubenswrapper[4555]: I1209 15:01:54.702060 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 15:01:54 crc kubenswrapper[4555]: I1209 15:01:54.702087 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 15:01:54 crc kubenswrapper[4555]: I1209 15:01:54.702193 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 15:01:54 crc kubenswrapper[4555]: I1209 15:01:54.702240 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 15:01:54 crc kubenswrapper[4555]: I1209 15:01:54.802930 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 15:01:54 crc kubenswrapper[4555]: I1209 15:01:54.802979 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 15:01:54 crc kubenswrapper[4555]: I1209 15:01:54.803002 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 15:01:54 crc kubenswrapper[4555]: I1209 15:01:54.803025 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 15:01:54 crc kubenswrapper[4555]: I1209 15:01:54.803043 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 15:01:54 crc kubenswrapper[4555]: I1209 15:01:54.803060 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 15:01:54 crc kubenswrapper[4555]: I1209 15:01:54.803077 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 15:01:54 crc kubenswrapper[4555]: I1209 15:01:54.803091 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 15:01:54 crc kubenswrapper[4555]: I1209 15:01:54.803125 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 15:01:54 crc kubenswrapper[4555]: I1209 15:01:54.803144 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 15:01:54 crc kubenswrapper[4555]: I1209 15:01:54.803155 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 15:01:54 crc kubenswrapper[4555]: I1209 15:01:54.803095 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 15:01:54 crc kubenswrapper[4555]: I1209 15:01:54.803155 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 15:01:54 crc kubenswrapper[4555]: I1209 15:01:54.803166 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 15:01:54 crc kubenswrapper[4555]: I1209 15:01:54.803328 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 15:01:54 crc kubenswrapper[4555]: I1209 15:01:54.803362 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 15:01:54 crc kubenswrapper[4555]: I1209 15:01:54.926704 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 15:01:54 crc kubenswrapper[4555]: W1209 15:01:54.940213 4555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-b6750a43d88fa8dda6cc123027e2c6a206aab404b5c451ed581ec75f5fbf4535 WatchSource:0}: Error finding container b6750a43d88fa8dda6cc123027e2c6a206aab404b5c451ed581ec75f5fbf4535: Status 404 returned error can't find the container with id b6750a43d88fa8dda6cc123027e2c6a206aab404b5c451ed581ec75f5fbf4535 Dec 09 15:01:54 crc kubenswrapper[4555]: E1209 15:01:54.942516 4555 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.26.32:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187f94307dc617d1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-09 15:01:54.941999057 +0000 UTC m=+225.078711775,LastTimestamp:2025-12-09 15:01:54.941999057 +0000 UTC m=+225.078711775,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 09 15:01:54 crc kubenswrapper[4555]: E1209 15:01:54.981526 4555 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.32:6443: connect: connection refused" Dec 09 15:01:54 crc kubenswrapper[4555]: E1209 15:01:54.981939 4555 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.32:6443: connect: connection refused" Dec 09 15:01:54 crc kubenswrapper[4555]: E1209 15:01:54.982248 4555 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.32:6443: connect: connection refused" Dec 09 15:01:54 crc kubenswrapper[4555]: E1209 15:01:54.982437 4555 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.32:6443: connect: connection refused" Dec 09 15:01:54 crc kubenswrapper[4555]: E1209 15:01:54.982606 4555 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.32:6443: connect: connection refused" Dec 09 15:01:54 crc kubenswrapper[4555]: I1209 15:01:54.982634 4555 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Dec 09 15:01:54 crc kubenswrapper[4555]: 
E1209 15:01:54.982868 4555 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.32:6443: connect: connection refused" interval="200ms" Dec 09 15:01:55 crc kubenswrapper[4555]: I1209 15:01:55.147385 4555 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 09 15:01:55 crc kubenswrapper[4555]: I1209 15:01:55.147425 4555 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 09 15:01:55 crc kubenswrapper[4555]: I1209 15:01:55.173111 4555 generic.go:334] "Generic (PLEG): container finished" podID="bbdfa4d2-08c3-40dd-b100-622dceb965cd" containerID="17e652929c1a43cb321af7b81169d93e40be82e9e58a0be3f8f9139afd089153" exitCode=0 Dec 09 15:01:55 crc kubenswrapper[4555]: I1209 15:01:55.173191 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"bbdfa4d2-08c3-40dd-b100-622dceb965cd","Type":"ContainerDied","Data":"17e652929c1a43cb321af7b81169d93e40be82e9e58a0be3f8f9139afd089153"} Dec 09 15:01:55 crc kubenswrapper[4555]: I1209 15:01:55.173644 4555 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 192.168.26.32:6443: connect: connection refused" Dec 09 15:01:55 crc kubenswrapper[4555]: I1209 15:01:55.173852 4555 status_manager.go:851] "Failed to get status for pod" podUID="bbdfa4d2-08c3-40dd-b100-622dceb965cd" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.32:6443: connect: connection refused" Dec 09 15:01:55 crc kubenswrapper[4555]: I1209 15:01:55.174089 4555 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.26.32:6443: connect: connection refused" Dec 09 15:01:55 crc kubenswrapper[4555]: I1209 15:01:55.175925 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 09 15:01:55 crc kubenswrapper[4555]: I1209 15:01:55.176523 4555 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c591bf964c364ed5dfb0b30ae43f7fd456c549f701b0f1b8f7ab9ca7b8648f1b" exitCode=0 Dec 09 15:01:55 crc kubenswrapper[4555]: I1209 15:01:55.176543 4555 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7803e19de1c3c096dfb29cd40b66a640cf8fe695b24eaf0ecfef38e88feefc27" exitCode=0 Dec 09 
15:01:55 crc kubenswrapper[4555]: I1209 15:01:55.176552 4555 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="26cd3af11c101edd63da0124a2190c0ee60408cf12e241de0b5743aff5732738" exitCode=0 Dec 09 15:01:55 crc kubenswrapper[4555]: I1209 15:01:55.176562 4555 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="787954aa7f87026339e7b167ba10c60c81f91c4d0f234fc508684f929edda0ed" exitCode=2 Dec 09 15:01:55 crc kubenswrapper[4555]: I1209 15:01:55.177648 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"a9f634a95c76b3a1a725d945590ee6d5166f8b1711fe56be36eb555c46b354bd"} Dec 09 15:01:55 crc kubenswrapper[4555]: I1209 15:01:55.177674 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"b6750a43d88fa8dda6cc123027e2c6a206aab404b5c451ed581ec75f5fbf4535"} Dec 09 15:01:55 crc kubenswrapper[4555]: I1209 15:01:55.177942 4555 status_manager.go:851] "Failed to get status for pod" podUID="bbdfa4d2-08c3-40dd-b100-622dceb965cd" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.32:6443: connect: connection refused" Dec 09 15:01:55 crc kubenswrapper[4555]: I1209 15:01:55.178140 4555 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.26.32:6443: connect: connection refused" Dec 09 15:01:55 crc kubenswrapper[4555]: I1209 15:01:55.178355 4555 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 192.168.26.32:6443: connect: connection refused" Dec 09 15:01:55 crc kubenswrapper[4555]: E1209 15:01:55.183641 4555 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.32:6443: connect: connection refused" interval="400ms" Dec 09 15:01:55 crc kubenswrapper[4555]: I1209 15:01:55.531556 4555 patch_prober.go:28] interesting pod/machine-config-daemon-x6p6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 15:01:55 crc kubenswrapper[4555]: I1209 15:01:55.531615 4555 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 15:01:55 crc kubenswrapper[4555]: I1209 15:01:55.531659 4555 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" Dec 09 15:01:55 crc kubenswrapper[4555]: I1209 15:01:55.532036 4555 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0b220eef2e4180fd65d4a5381999b91753bb7322a9ab7101edf783cb800715fa"} pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 15:01:55 crc kubenswrapper[4555]: I1209 15:01:55.532103 4555 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" containerName="machine-config-daemon" containerID="cri-o://0b220eef2e4180fd65d4a5381999b91753bb7322a9ab7101edf783cb800715fa" gracePeriod=600 Dec 09 15:01:55 crc kubenswrapper[4555]: E1209 15:01:55.584384 4555 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.32:6443: connect: connection refused" interval="800ms" Dec 09 15:01:56 crc kubenswrapper[4555]: I1209 15:01:56.181826 4555 generic.go:334] "Generic (PLEG): container finished" podID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" containerID="0b220eef2e4180fd65d4a5381999b91753bb7322a9ab7101edf783cb800715fa" exitCode=0 Dec 09 15:01:56 crc kubenswrapper[4555]: I1209 15:01:56.182202 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" event={"ID":"b929ab12-7d60-4e9f-a1a7-3e2d8b086d36","Type":"ContainerDied","Data":"0b220eef2e4180fd65d4a5381999b91753bb7322a9ab7101edf783cb800715fa"} Dec 09 15:01:56 crc kubenswrapper[4555]: I1209 15:01:56.182228 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" event={"ID":"b929ab12-7d60-4e9f-a1a7-3e2d8b086d36","Type":"ContainerStarted","Data":"180dd851cd6ac7ce99763c65c8a9c37630bce1147ad5777bcc34e7fd32e66125"} Dec 09 15:01:56 crc kubenswrapper[4555]: I1209 15:01:56.183061 4555 status_manager.go:851] "Failed to get status for pod" podUID="bbdfa4d2-08c3-40dd-b100-622dceb965cd" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.32:6443: connect: connection refused" Dec 09 15:01:56 crc kubenswrapper[4555]: I1209 15:01:56.183392 4555 status_manager.go:851] "Failed to get status for pod" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-x6p6q\": dial tcp 192.168.26.32:6443: connect: connection refused" Dec 09 15:01:56 crc kubenswrapper[4555]: I1209 15:01:56.183686 4555 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 192.168.26.32:6443: connect: connection refused" Dec 09 15:01:56 crc kubenswrapper[4555]: I1209 15:01:56.348847 4555 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 09 15:01:56 crc kubenswrapper[4555]: I1209 15:01:56.349560 4555 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 192.168.26.32:6443: connect: connection refused" Dec 09 15:01:56 crc kubenswrapper[4555]: I1209 15:01:56.350279 4555 status_manager.go:851] "Failed to get status for pod" podUID="bbdfa4d2-08c3-40dd-b100-622dceb965cd" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.32:6443: connect: connection refused" Dec 09 15:01:56 crc kubenswrapper[4555]: I1209 15:01:56.350661 4555 status_manager.go:851] "Failed to get status for pod" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-x6p6q\": dial tcp 192.168.26.32:6443: connect: connection refused" Dec 09 15:01:56 crc kubenswrapper[4555]: E1209 15:01:56.385241 4555 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.32:6443: connect: connection refused" interval="1.6s" Dec 09 15:01:56 crc kubenswrapper[4555]: I1209 15:01:56.420162 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bbdfa4d2-08c3-40dd-b100-622dceb965cd-kube-api-access\") pod \"bbdfa4d2-08c3-40dd-b100-622dceb965cd\" (UID: \"bbdfa4d2-08c3-40dd-b100-622dceb965cd\") " Dec 09 15:01:56 crc kubenswrapper[4555]: I1209 15:01:56.420219 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/bbdfa4d2-08c3-40dd-b100-622dceb965cd-var-lock\") pod \"bbdfa4d2-08c3-40dd-b100-622dceb965cd\" (UID: \"bbdfa4d2-08c3-40dd-b100-622dceb965cd\") " Dec 09 15:01:56 crc kubenswrapper[4555]: I1209 15:01:56.420241 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bbdfa4d2-08c3-40dd-b100-622dceb965cd-kubelet-dir\") pod \"bbdfa4d2-08c3-40dd-b100-622dceb965cd\" (UID: \"bbdfa4d2-08c3-40dd-b100-622dceb965cd\") " Dec 09 15:01:56 crc kubenswrapper[4555]: I1209 15:01:56.420294 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bbdfa4d2-08c3-40dd-b100-622dceb965cd-var-lock" (OuterVolumeSpecName: "var-lock") pod "bbdfa4d2-08c3-40dd-b100-622dceb965cd" (UID: "bbdfa4d2-08c3-40dd-b100-622dceb965cd"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 15:01:56 crc kubenswrapper[4555]: I1209 15:01:56.420351 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bbdfa4d2-08c3-40dd-b100-622dceb965cd-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "bbdfa4d2-08c3-40dd-b100-622dceb965cd" (UID: "bbdfa4d2-08c3-40dd-b100-622dceb965cd"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 15:01:56 crc kubenswrapper[4555]: I1209 15:01:56.420440 4555 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/bbdfa4d2-08c3-40dd-b100-622dceb965cd-var-lock\") on node \"crc\" DevicePath \"\"" Dec 09 15:01:56 crc kubenswrapper[4555]: I1209 15:01:56.420451 4555 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bbdfa4d2-08c3-40dd-b100-622dceb965cd-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 09 15:01:56 crc kubenswrapper[4555]: I1209 15:01:56.424314 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbdfa4d2-08c3-40dd-b100-622dceb965cd-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "bbdfa4d2-08c3-40dd-b100-622dceb965cd" (UID: "bbdfa4d2-08c3-40dd-b100-622dceb965cd"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:01:56 crc kubenswrapper[4555]: I1209 15:01:56.522018 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bbdfa4d2-08c3-40dd-b100-622dceb965cd-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 09 15:01:56 crc kubenswrapper[4555]: I1209 15:01:56.912568 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 09 15:01:56 crc kubenswrapper[4555]: I1209 15:01:56.913411 4555 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 15:01:56 crc kubenswrapper[4555]: I1209 15:01:56.913885 4555 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 192.168.26.32:6443: connect: connection refused" Dec 09 15:01:56 crc kubenswrapper[4555]: I1209 15:01:56.914460 4555 status_manager.go:851] "Failed to get status for pod" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-x6p6q\": dial tcp 192.168.26.32:6443: connect: connection refused" Dec 09 15:01:56 crc kubenswrapper[4555]: I1209 15:01:56.914711 4555 status_manager.go:851] "Failed to get status for pod" podUID="bbdfa4d2-08c3-40dd-b100-622dceb965cd" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.32:6443: connect: connection refused" Dec 09 15:01:56 crc kubenswrapper[4555]: I1209 15:01:56.914929 4555 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.26.32:6443: connect: connection refused" Dec 09 15:01:57 crc kubenswrapper[4555]: I1209 15:01:57.027492 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 09 15:01:57 crc kubenswrapper[4555]: I1209 15:01:57.027578 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 09 15:01:57 crc kubenswrapper[4555]: I1209 15:01:57.027608 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 15:01:57 crc kubenswrapper[4555]: I1209 15:01:57.027646 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 09 15:01:57 crc kubenswrapper[4555]: I1209 15:01:57.027673 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 15:01:57 crc kubenswrapper[4555]: I1209 15:01:57.027782 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 15:01:57 crc kubenswrapper[4555]: I1209 15:01:57.027952 4555 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Dec 09 15:01:57 crc kubenswrapper[4555]: I1209 15:01:57.027969 4555 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 09 15:01:57 crc kubenswrapper[4555]: I1209 15:01:57.027978 4555 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 09 15:01:57 crc kubenswrapper[4555]: I1209 15:01:57.186768 4555 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 09 15:01:57 crc kubenswrapper[4555]: I1209 15:01:57.186774 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"bbdfa4d2-08c3-40dd-b100-622dceb965cd","Type":"ContainerDied","Data":"c534cf5d61a58d62da6c8c9a2f1096fdf12bac7dbf22c08fa3f2bcc3966a5863"} Dec 09 15:01:57 crc kubenswrapper[4555]: I1209 15:01:57.186832 4555 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c534cf5d61a58d62da6c8c9a2f1096fdf12bac7dbf22c08fa3f2bcc3966a5863" Dec 09 15:01:57 crc kubenswrapper[4555]: I1209 15:01:57.189140 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 09 15:01:57 crc kubenswrapper[4555]: I1209 15:01:57.189782 4555 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="cd53bcc6affff59de2add405be1496710b4488d3794d358533ddaeecb1c50c42" exitCode=0 Dec 09 15:01:57 crc kubenswrapper[4555]: I1209 15:01:57.189821 4555 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 15:01:57 crc kubenswrapper[4555]: I1209 15:01:57.189832 4555 scope.go:117] "RemoveContainer" containerID="c591bf964c364ed5dfb0b30ae43f7fd456c549f701b0f1b8f7ab9ca7b8648f1b" Dec 09 15:01:57 crc kubenswrapper[4555]: I1209 15:01:57.198485 4555 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 192.168.26.32:6443: connect: connection refused" Dec 09 15:01:57 crc kubenswrapper[4555]: I1209 15:01:57.198873 4555 status_manager.go:851] "Failed to get status for pod" podUID="bbdfa4d2-08c3-40dd-b100-622dceb965cd" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.32:6443: connect: connection refused" Dec 09 15:01:57 crc kubenswrapper[4555]: I1209 15:01:57.199196 4555 status_manager.go:851] "Failed to get status for pod" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-x6p6q\": dial tcp 192.168.26.32:6443: connect: connection refused" Dec 09 15:01:57 crc kubenswrapper[4555]: I1209 15:01:57.199488 4555 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.26.32:6443: connect: connection refused" Dec 09 15:01:57 crc kubenswrapper[4555]: I1209 15:01:57.200671 4555 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.26.32:6443: connect: connection refused" Dec 09 15:01:57 crc kubenswrapper[4555]: I1209 15:01:57.201003 4555 
status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 192.168.26.32:6443: connect: connection refused" Dec 09 15:01:57 crc kubenswrapper[4555]: I1209 15:01:57.201248 4555 status_manager.go:851] "Failed to get status for pod" podUID="bbdfa4d2-08c3-40dd-b100-622dceb965cd" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.32:6443: connect: connection refused" Dec 09 15:01:57 crc kubenswrapper[4555]: I1209 15:01:57.201495 4555 status_manager.go:851] "Failed to get status for pod" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-x6p6q\": dial tcp 192.168.26.32:6443: connect: connection refused" Dec 09 15:01:57 crc kubenswrapper[4555]: I1209 15:01:57.202961 4555 scope.go:117] "RemoveContainer" containerID="7803e19de1c3c096dfb29cd40b66a640cf8fe695b24eaf0ecfef38e88feefc27" Dec 09 15:01:57 crc kubenswrapper[4555]: I1209 15:01:57.214272 4555 scope.go:117] "RemoveContainer" containerID="26cd3af11c101edd63da0124a2190c0ee60408cf12e241de0b5743aff5732738" Dec 09 15:01:57 crc kubenswrapper[4555]: I1209 15:01:57.222342 4555 scope.go:117] "RemoveContainer" containerID="787954aa7f87026339e7b167ba10c60c81f91c4d0f234fc508684f929edda0ed" Dec 09 15:01:57 crc kubenswrapper[4555]: I1209 15:01:57.230128 4555 scope.go:117] "RemoveContainer" containerID="cd53bcc6affff59de2add405be1496710b4488d3794d358533ddaeecb1c50c42" Dec 09 15:01:57 crc kubenswrapper[4555]: I1209 15:01:57.238402 4555 scope.go:117] "RemoveContainer" containerID="67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d" Dec 09 15:01:57 crc kubenswrapper[4555]: I1209 15:01:57.252203 4555 scope.go:117] "RemoveContainer" containerID="c591bf964c364ed5dfb0b30ae43f7fd456c549f701b0f1b8f7ab9ca7b8648f1b" Dec 09 15:01:57 crc kubenswrapper[4555]: E1209 15:01:57.252478 4555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c591bf964c364ed5dfb0b30ae43f7fd456c549f701b0f1b8f7ab9ca7b8648f1b\": container with ID starting with c591bf964c364ed5dfb0b30ae43f7fd456c549f701b0f1b8f7ab9ca7b8648f1b not found: ID does not exist" containerID="c591bf964c364ed5dfb0b30ae43f7fd456c549f701b0f1b8f7ab9ca7b8648f1b" Dec 09 15:01:57 crc kubenswrapper[4555]: I1209 15:01:57.252507 4555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c591bf964c364ed5dfb0b30ae43f7fd456c549f701b0f1b8f7ab9ca7b8648f1b"} err="failed to get container status \"c591bf964c364ed5dfb0b30ae43f7fd456c549f701b0f1b8f7ab9ca7b8648f1b\": rpc error: code = NotFound desc = could not find container \"c591bf964c364ed5dfb0b30ae43f7fd456c549f701b0f1b8f7ab9ca7b8648f1b\": container with ID starting with c591bf964c364ed5dfb0b30ae43f7fd456c549f701b0f1b8f7ab9ca7b8648f1b not found: ID does not exist" Dec 09 15:01:57 crc kubenswrapper[4555]: I1209 15:01:57.252524 4555 scope.go:117] "RemoveContainer" containerID="7803e19de1c3c096dfb29cd40b66a640cf8fe695b24eaf0ecfef38e88feefc27" Dec 09 15:01:57 crc kubenswrapper[4555]: E1209 15:01:57.252789 4555 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7803e19de1c3c096dfb29cd40b66a640cf8fe695b24eaf0ecfef38e88feefc27\": container with ID starting with 7803e19de1c3c096dfb29cd40b66a640cf8fe695b24eaf0ecfef38e88feefc27 not found: ID does not exist" containerID="7803e19de1c3c096dfb29cd40b66a640cf8fe695b24eaf0ecfef38e88feefc27" Dec 09 15:01:57 crc kubenswrapper[4555]: I1209 15:01:57.252849 4555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7803e19de1c3c096dfb29cd40b66a640cf8fe695b24eaf0ecfef38e88feefc27"} err="failed to get container status \"7803e19de1c3c096dfb29cd40b66a640cf8fe695b24eaf0ecfef38e88feefc27\": rpc error: code = NotFound desc = could not find container \"7803e19de1c3c096dfb29cd40b66a640cf8fe695b24eaf0ecfef38e88feefc27\": container with ID starting with 7803e19de1c3c096dfb29cd40b66a640cf8fe695b24eaf0ecfef38e88feefc27 not found: ID does not exist" Dec 09 15:01:57 crc kubenswrapper[4555]: I1209 15:01:57.252862 4555 scope.go:117] "RemoveContainer" containerID="26cd3af11c101edd63da0124a2190c0ee60408cf12e241de0b5743aff5732738" Dec 09 15:01:57 crc kubenswrapper[4555]: E1209 15:01:57.253326 4555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26cd3af11c101edd63da0124a2190c0ee60408cf12e241de0b5743aff5732738\": container with ID starting with 26cd3af11c101edd63da0124a2190c0ee60408cf12e241de0b5743aff5732738 not found: ID does not exist" containerID="26cd3af11c101edd63da0124a2190c0ee60408cf12e241de0b5743aff5732738" Dec 09 15:01:57 crc kubenswrapper[4555]: I1209 15:01:57.253347 4555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26cd3af11c101edd63da0124a2190c0ee60408cf12e241de0b5743aff5732738"} err="failed to get container status \"26cd3af11c101edd63da0124a2190c0ee60408cf12e241de0b5743aff5732738\": rpc error: code = NotFound desc = could not find container \"26cd3af11c101edd63da0124a2190c0ee60408cf12e241de0b5743aff5732738\": container with ID starting with 26cd3af11c101edd63da0124a2190c0ee60408cf12e241de0b5743aff5732738 not found: ID does not exist" Dec 09 15:01:57 crc kubenswrapper[4555]: I1209 15:01:57.253360 4555 scope.go:117] "RemoveContainer" containerID="787954aa7f87026339e7b167ba10c60c81f91c4d0f234fc508684f929edda0ed" Dec 09 15:01:57 crc kubenswrapper[4555]: E1209 15:01:57.253671 4555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"787954aa7f87026339e7b167ba10c60c81f91c4d0f234fc508684f929edda0ed\": container with ID starting with 787954aa7f87026339e7b167ba10c60c81f91c4d0f234fc508684f929edda0ed not found: ID does not exist" containerID="787954aa7f87026339e7b167ba10c60c81f91c4d0f234fc508684f929edda0ed" Dec 09 15:01:57 crc kubenswrapper[4555]: I1209 15:01:57.253692 4555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"787954aa7f87026339e7b167ba10c60c81f91c4d0f234fc508684f929edda0ed"} err="failed to get container status \"787954aa7f87026339e7b167ba10c60c81f91c4d0f234fc508684f929edda0ed\": rpc error: code = NotFound desc = could not find container \"787954aa7f87026339e7b167ba10c60c81f91c4d0f234fc508684f929edda0ed\": container with ID starting with 787954aa7f87026339e7b167ba10c60c81f91c4d0f234fc508684f929edda0ed not found: ID does not exist" Dec 09 15:01:57 crc kubenswrapper[4555]: I1209 15:01:57.253705 4555 scope.go:117] "RemoveContainer" 
containerID="cd53bcc6affff59de2add405be1496710b4488d3794d358533ddaeecb1c50c42" Dec 09 15:01:57 crc kubenswrapper[4555]: E1209 15:01:57.253882 4555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd53bcc6affff59de2add405be1496710b4488d3794d358533ddaeecb1c50c42\": container with ID starting with cd53bcc6affff59de2add405be1496710b4488d3794d358533ddaeecb1c50c42 not found: ID does not exist" containerID="cd53bcc6affff59de2add405be1496710b4488d3794d358533ddaeecb1c50c42" Dec 09 15:01:57 crc kubenswrapper[4555]: I1209 15:01:57.253903 4555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd53bcc6affff59de2add405be1496710b4488d3794d358533ddaeecb1c50c42"} err="failed to get container status \"cd53bcc6affff59de2add405be1496710b4488d3794d358533ddaeecb1c50c42\": rpc error: code = NotFound desc = could not find container \"cd53bcc6affff59de2add405be1496710b4488d3794d358533ddaeecb1c50c42\": container with ID starting with cd53bcc6affff59de2add405be1496710b4488d3794d358533ddaeecb1c50c42 not found: ID does not exist" Dec 09 15:01:57 crc kubenswrapper[4555]: I1209 15:01:57.253914 4555 scope.go:117] "RemoveContainer" containerID="67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d" Dec 09 15:01:57 crc kubenswrapper[4555]: E1209 15:01:57.254094 4555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\": container with ID starting with 67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d not found: ID does not exist" containerID="67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d" Dec 09 15:01:57 crc kubenswrapper[4555]: I1209 15:01:57.254125 4555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d"} err="failed to get container status \"67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\": rpc error: code = NotFound desc = could not find container \"67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d\": container with ID starting with 67d4ab80204bce05d2bd6dea0e99f3624f9f38a69ddc198c88d253b22321fd3d not found: ID does not exist" Dec 09 15:01:57 crc kubenswrapper[4555]: E1209 15:01:57.985777 4555 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.32:6443: connect: connection refused" interval="3.2s" Dec 09 15:01:58 crc kubenswrapper[4555]: I1209 15:01:58.165865 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Dec 09 15:02:00 crc kubenswrapper[4555]: I1209 15:02:00.162900 4555 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 192.168.26.32:6443: connect: connection refused" Dec 09 15:02:00 crc kubenswrapper[4555]: I1209 15:02:00.163358 4555 status_manager.go:851] "Failed to get status for pod" podUID="bbdfa4d2-08c3-40dd-b100-622dceb965cd" 
pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.32:6443: connect: connection refused" Dec 09 15:02:00 crc kubenswrapper[4555]: I1209 15:02:00.163585 4555 status_manager.go:851] "Failed to get status for pod" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-x6p6q\": dial tcp 192.168.26.32:6443: connect: connection refused" Dec 09 15:02:00 crc kubenswrapper[4555]: E1209 15:02:00.913571 4555 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.26.32:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187f94307dc617d1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-09 15:01:54.941999057 +0000 UTC m=+225.078711775,LastTimestamp:2025-12-09 15:01:54.941999057 +0000 UTC m=+225.078711775,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 09 15:02:01 crc kubenswrapper[4555]: E1209 15:02:01.186200 4555 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.32:6443: connect: connection refused" interval="6.4s" Dec 09 15:02:04 crc kubenswrapper[4555]: E1209 15:02:04.667595 4555 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T15:02:04Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T15:02:04Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T15:02:04Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T15:02:04Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 192.168.26.32:6443: connect: connection refused" Dec 09 15:02:04 crc kubenswrapper[4555]: E1209 15:02:04.669010 4555 kubelet_node_status.go:585] "Error updating 
node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 192.168.26.32:6443: connect: connection refused" Dec 09 15:02:04 crc kubenswrapper[4555]: E1209 15:02:04.669266 4555 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 192.168.26.32:6443: connect: connection refused" Dec 09 15:02:04 crc kubenswrapper[4555]: E1209 15:02:04.669583 4555 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 192.168.26.32:6443: connect: connection refused" Dec 09 15:02:04 crc kubenswrapper[4555]: E1209 15:02:04.669773 4555 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 192.168.26.32:6443: connect: connection refused" Dec 09 15:02:04 crc kubenswrapper[4555]: E1209 15:02:04.669792 4555 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 09 15:02:07 crc kubenswrapper[4555]: E1209 15:02:07.586787 4555 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.32:6443: connect: connection refused" interval="7s" Dec 09 15:02:08 crc kubenswrapper[4555]: I1209 15:02:08.160897 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 15:02:08 crc kubenswrapper[4555]: I1209 15:02:08.161621 4555 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 192.168.26.32:6443: connect: connection refused" Dec 09 15:02:08 crc kubenswrapper[4555]: I1209 15:02:08.162401 4555 status_manager.go:851] "Failed to get status for pod" podUID="bbdfa4d2-08c3-40dd-b100-622dceb965cd" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.32:6443: connect: connection refused" Dec 09 15:02:08 crc kubenswrapper[4555]: I1209 15:02:08.162722 4555 status_manager.go:851] "Failed to get status for pod" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-x6p6q\": dial tcp 192.168.26.32:6443: connect: connection refused" Dec 09 15:02:08 crc kubenswrapper[4555]: I1209 15:02:08.171056 4555 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="031bef68-de1b-47ae-92d9-8f3ab57baecb" Dec 09 15:02:08 crc kubenswrapper[4555]: I1209 15:02:08.171077 4555 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="031bef68-de1b-47ae-92d9-8f3ab57baecb" Dec 09 15:02:08 crc kubenswrapper[4555]: E1209 15:02:08.171350 4555 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.26.32:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 15:02:08 crc kubenswrapper[4555]: I1209 15:02:08.171662 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 15:02:08 crc kubenswrapper[4555]: W1209 15:02:08.185197 4555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-e0423bc05fbe0a9f9a67560fcf1f44c15951ec7a4dd72a2794d077141925699a WatchSource:0}: Error finding container e0423bc05fbe0a9f9a67560fcf1f44c15951ec7a4dd72a2794d077141925699a: Status 404 returned error can't find the container with id e0423bc05fbe0a9f9a67560fcf1f44c15951ec7a4dd72a2794d077141925699a Dec 09 15:02:08 crc kubenswrapper[4555]: I1209 15:02:08.230574 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"e0423bc05fbe0a9f9a67560fcf1f44c15951ec7a4dd72a2794d077141925699a"} Dec 09 15:02:09 crc kubenswrapper[4555]: I1209 15:02:09.237296 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 09 15:02:09 crc kubenswrapper[4555]: I1209 15:02:09.237515 4555 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="376e853c04e3ccafef162872380c7bf1dc0301c04c70c5428b965ca18989088f" exitCode=1 Dec 09 15:02:09 crc kubenswrapper[4555]: I1209 15:02:09.237571 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"376e853c04e3ccafef162872380c7bf1dc0301c04c70c5428b965ca18989088f"} Dec 09 15:02:09 crc kubenswrapper[4555]: I1209 15:02:09.237852 4555 scope.go:117] "RemoveContainer" containerID="376e853c04e3ccafef162872380c7bf1dc0301c04c70c5428b965ca18989088f" Dec 09 15:02:09 crc kubenswrapper[4555]: I1209 15:02:09.238342 4555 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 192.168.26.32:6443: connect: connection refused" Dec 09 15:02:09 crc kubenswrapper[4555]: I1209 15:02:09.238767 4555 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 192.168.26.32:6443: connect: connection refused" Dec 09 15:02:09 crc kubenswrapper[4555]: I1209 15:02:09.238995 4555 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="f55dc382037b5df59233ee1873076bf832ef7daf32b74bff7f53ebaec936b8c9" exitCode=0 Dec 09 15:02:09 crc kubenswrapper[4555]: I1209 15:02:09.239007 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"f55dc382037b5df59233ee1873076bf832ef7daf32b74bff7f53ebaec936b8c9"} Dec 09 15:02:09 crc kubenswrapper[4555]: I1209 15:02:09.239253 4555 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="031bef68-de1b-47ae-92d9-8f3ab57baecb" Dec 09 15:02:09 crc kubenswrapper[4555]: I1209 15:02:09.239440 4555 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="031bef68-de1b-47ae-92d9-8f3ab57baecb" Dec 09 15:02:09 crc kubenswrapper[4555]: I1209 15:02:09.239570 4555 status_manager.go:851] "Failed to get status for pod" podUID="bbdfa4d2-08c3-40dd-b100-622dceb965cd" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.32:6443: connect: connection refused" Dec 09 15:02:09 crc kubenswrapper[4555]: E1209 15:02:09.239740 4555 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.26.32:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 15:02:09 crc kubenswrapper[4555]: I1209 15:02:09.240143 4555 status_manager.go:851] "Failed to get status for pod" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-x6p6q\": dial tcp 192.168.26.32:6443: connect: connection refused" Dec 09 15:02:09 crc kubenswrapper[4555]: I1209 15:02:09.240443 4555 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 192.168.26.32:6443: connect: connection refused" Dec 09 15:02:09 crc kubenswrapper[4555]: I1209 15:02:09.240704 4555 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 192.168.26.32:6443: connect: connection refused" Dec 09 15:02:09 crc kubenswrapper[4555]: I1209 15:02:09.240951 4555 status_manager.go:851] "Failed to get status for pod" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-x6p6q\": dial tcp 192.168.26.32:6443: connect: connection refused" Dec 09 15:02:09 crc kubenswrapper[4555]: I1209 15:02:09.241187 4555 status_manager.go:851] "Failed to get status for pod" podUID="bbdfa4d2-08c3-40dd-b100-622dceb965cd" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.32:6443: connect: connection refused" Dec 09 15:02:10 crc kubenswrapper[4555]: I1209 15:02:10.256699 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"9846c867ed4c6073e5edcd1af1898f9cfb141168fd792975d0b5221cb72a5d36"} Dec 09 15:02:10 crc kubenswrapper[4555]: I1209 15:02:10.256924 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 15:02:10 crc kubenswrapper[4555]: I1209 15:02:10.256935 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"fc8633cdd1fbb41da25bc4a0ad6aea5eb132162a1aecd83a7890cf02478e76ec"} Dec 09 15:02:10 crc kubenswrapper[4555]: I1209 15:02:10.256945 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"7f7b5954d6c27b19f9bf2b4ac68cd5837b7ece95b2981376d5e88c480408a556"} Dec 09 15:02:10 crc kubenswrapper[4555]: I1209 15:02:10.256953 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"b2713a898d9a321ef551bcd7bd782795b309bc4f7bfb0146e80f5c30d28944d7"} Dec 09 15:02:10 crc kubenswrapper[4555]: I1209 15:02:10.256960 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"08bc25c04708351b36c2a7217ecee944d90c739d49fdb316abc6c8e2862d4261"} Dec 09 15:02:10 crc kubenswrapper[4555]: I1209 15:02:10.256863 4555 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="031bef68-de1b-47ae-92d9-8f3ab57baecb" Dec 09 15:02:10 crc kubenswrapper[4555]: I1209 15:02:10.256978 4555 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="031bef68-de1b-47ae-92d9-8f3ab57baecb" Dec 09 15:02:10 crc kubenswrapper[4555]: I1209 15:02:10.258890 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 09 15:02:10 crc kubenswrapper[4555]: I1209 15:02:10.258930 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"dacaf1d04cad2ade41cf37c74e995b03d08a14e2072dd4982f85471c07662a33"} Dec 09 15:02:13 crc kubenswrapper[4555]: I1209 15:02:13.172742 4555 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 15:02:13 crc kubenswrapper[4555]: I1209 15:02:13.172971 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 15:02:13 crc kubenswrapper[4555]: I1209 15:02:13.176233 4555 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 15:02:13 crc kubenswrapper[4555]: I1209 15:02:13.888202 4555 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 15:02:13 crc kubenswrapper[4555]: I1209 15:02:13.891016 4555 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 15:02:14 crc kubenswrapper[4555]: I1209 15:02:14.274157 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 15:02:15 crc kubenswrapper[4555]: I1209 15:02:15.406576 4555 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 15:02:15 crc kubenswrapper[4555]: I1209 15:02:15.408255 4555 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"031bef68-de1b-47ae-92d9-8f3ab57baecb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T15:02:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T15:02:09Z\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T15:02:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-cert-syncer kube-apiserver-cert-regeneration-controller kube-apiserver-insecure-readyz kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T15:02:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-cert-syncer kube-apiserver-cert-regeneration-controller kube-apiserver-insecure-readyz kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}}}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f55dc382037b5df59233ee1873076bf832ef7daf32b74bff7f53ebaec936b8c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f55dc382037b5df59233ee1873076bf832ef7daf32b74bff7f53ebaec936b8c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T15:02:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T15:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Pending\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": pods \"kube-apiserver-crc\" not found" Dec 09 15:02:15 crc kubenswrapper[4555]: I1209 15:02:15.454126 4555 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="29911b33-ac73-4e04-83a4-4f499b8777bc" Dec 09 15:02:16 crc kubenswrapper[4555]: I1209 15:02:16.280834 4555 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="031bef68-de1b-47ae-92d9-8f3ab57baecb" Dec 09 15:02:16 crc kubenswrapper[4555]: I1209 15:02:16.280862 4555 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="031bef68-de1b-47ae-92d9-8f3ab57baecb" Dec 09 15:02:16 crc kubenswrapper[4555]: I1209 15:02:16.282859 4555 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="29911b33-ac73-4e04-83a4-4f499b8777bc" Dec 09 15:02:16 crc kubenswrapper[4555]: I1209 15:02:16.284161 4555 status_manager.go:308] "Container readiness changed before pod has synced" pod="openshift-kube-apiserver/kube-apiserver-crc" containerID="cri-o://08bc25c04708351b36c2a7217ecee944d90c739d49fdb316abc6c8e2862d4261" Dec 09 15:02:16 crc 
kubenswrapper[4555]: I1209 15:02:16.284179 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 15:02:17 crc kubenswrapper[4555]: I1209 15:02:17.284458 4555 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="031bef68-de1b-47ae-92d9-8f3ab57baecb" Dec 09 15:02:17 crc kubenswrapper[4555]: I1209 15:02:17.284599 4555 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="031bef68-de1b-47ae-92d9-8f3ab57baecb" Dec 09 15:02:17 crc kubenswrapper[4555]: I1209 15:02:17.286085 4555 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="29911b33-ac73-4e04-83a4-4f499b8777bc" Dec 09 15:02:25 crc kubenswrapper[4555]: I1209 15:02:25.396407 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 09 15:02:25 crc kubenswrapper[4555]: I1209 15:02:25.666525 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 09 15:02:25 crc kubenswrapper[4555]: I1209 15:02:25.696283 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 09 15:02:25 crc kubenswrapper[4555]: I1209 15:02:25.775208 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 09 15:02:26 crc kubenswrapper[4555]: I1209 15:02:26.252703 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 09 15:02:26 crc kubenswrapper[4555]: I1209 15:02:26.537087 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 09 15:02:26 crc kubenswrapper[4555]: I1209 15:02:26.710036 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 09 15:02:26 crc kubenswrapper[4555]: I1209 15:02:26.854202 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 09 15:02:26 crc kubenswrapper[4555]: I1209 15:02:26.974250 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 09 15:02:27 crc kubenswrapper[4555]: I1209 15:02:27.017875 4555 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 09 15:02:27 crc kubenswrapper[4555]: I1209 15:02:27.108230 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 09 15:02:27 crc kubenswrapper[4555]: I1209 15:02:27.164428 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 09 15:02:27 crc kubenswrapper[4555]: I1209 15:02:27.252977 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 09 15:02:27 crc kubenswrapper[4555]: I1209 15:02:27.446127 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 09 15:02:27 crc 
kubenswrapper[4555]: I1209 15:02:27.550376 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 15:02:27 crc kubenswrapper[4555]: I1209 15:02:27.552870 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 09 15:02:27 crc kubenswrapper[4555]: I1209 15:02:27.631161 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 09 15:02:27 crc kubenswrapper[4555]: I1209 15:02:27.758062 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 09 15:02:27 crc kubenswrapper[4555]: I1209 15:02:27.832075 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 09 15:02:27 crc kubenswrapper[4555]: I1209 15:02:27.878965 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 09 15:02:27 crc kubenswrapper[4555]: I1209 15:02:27.914577 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 09 15:02:28 crc kubenswrapper[4555]: I1209 15:02:28.029078 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 09 15:02:28 crc kubenswrapper[4555]: I1209 15:02:28.277883 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 09 15:02:28 crc kubenswrapper[4555]: I1209 15:02:28.332491 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 09 15:02:28 crc kubenswrapper[4555]: I1209 15:02:28.401320 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 09 15:02:28 crc kubenswrapper[4555]: I1209 15:02:28.769804 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 09 15:02:28 crc kubenswrapper[4555]: I1209 15:02:28.832437 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 09 15:02:28 crc kubenswrapper[4555]: I1209 15:02:28.939766 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 09 15:02:29 crc kubenswrapper[4555]: I1209 15:02:29.002024 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 09 15:02:29 crc kubenswrapper[4555]: I1209 15:02:29.143056 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 09 15:02:29 crc kubenswrapper[4555]: I1209 15:02:29.157092 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 09 15:02:29 crc kubenswrapper[4555]: I1209 15:02:29.229647 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 09 15:02:29 crc kubenswrapper[4555]: I1209 15:02:29.290782 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 09 15:02:29 crc kubenswrapper[4555]: 
I1209 15:02:29.296827 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 09 15:02:29 crc kubenswrapper[4555]: I1209 15:02:29.297723 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 09 15:02:29 crc kubenswrapper[4555]: I1209 15:02:29.448458 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 09 15:02:29 crc kubenswrapper[4555]: I1209 15:02:29.517683 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 09 15:02:29 crc kubenswrapper[4555]: I1209 15:02:29.542337 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 09 15:02:29 crc kubenswrapper[4555]: I1209 15:02:29.543141 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 09 15:02:29 crc kubenswrapper[4555]: I1209 15:02:29.596205 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 09 15:02:29 crc kubenswrapper[4555]: I1209 15:02:29.665575 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 09 15:02:29 crc kubenswrapper[4555]: I1209 15:02:29.671638 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 09 15:02:29 crc kubenswrapper[4555]: I1209 15:02:29.721687 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 09 15:02:29 crc kubenswrapper[4555]: I1209 15:02:29.766882 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 09 15:02:29 crc kubenswrapper[4555]: I1209 15:02:29.877386 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 09 15:02:29 crc kubenswrapper[4555]: I1209 15:02:29.889908 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 09 15:02:29 crc kubenswrapper[4555]: I1209 15:02:29.900766 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 09 15:02:29 crc kubenswrapper[4555]: I1209 15:02:29.960657 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 09 15:02:29 crc kubenswrapper[4555]: I1209 15:02:29.966588 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 09 15:02:29 crc kubenswrapper[4555]: I1209 15:02:29.986948 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 09 15:02:30 crc kubenswrapper[4555]: I1209 15:02:30.090575 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 09 15:02:30 crc kubenswrapper[4555]: I1209 15:02:30.122194 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 09 
15:02:30 crc kubenswrapper[4555]: I1209 15:02:30.199716 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 09 15:02:30 crc kubenswrapper[4555]: I1209 15:02:30.476401 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 09 15:02:30 crc kubenswrapper[4555]: I1209 15:02:30.507266 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 09 15:02:30 crc kubenswrapper[4555]: I1209 15:02:30.559626 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 09 15:02:30 crc kubenswrapper[4555]: I1209 15:02:30.641278 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 09 15:02:30 crc kubenswrapper[4555]: I1209 15:02:30.751127 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 09 15:02:30 crc kubenswrapper[4555]: I1209 15:02:30.837537 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 09 15:02:30 crc kubenswrapper[4555]: I1209 15:02:30.847773 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 09 15:02:30 crc kubenswrapper[4555]: I1209 15:02:30.899409 4555 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 09 15:02:30 crc kubenswrapper[4555]: I1209 15:02:30.900755 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=36.900742384 podStartE2EDuration="36.900742384s" podCreationTimestamp="2025-12-09 15:01:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:02:15.432593885 +0000 UTC m=+245.569306604" watchObservedRunningTime="2025-12-09 15:02:30.900742384 +0000 UTC m=+261.037455102" Dec 09 15:02:30 crc kubenswrapper[4555]: I1209 15:02:30.902733 4555 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 09 15:02:30 crc kubenswrapper[4555]: I1209 15:02:30.902774 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 09 15:02:30 crc kubenswrapper[4555]: I1209 15:02:30.903027 4555 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="031bef68-de1b-47ae-92d9-8f3ab57baecb" Dec 09 15:02:30 crc kubenswrapper[4555]: I1209 15:02:30.903047 4555 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="031bef68-de1b-47ae-92d9-8f3ab57baecb" Dec 09 15:02:30 crc kubenswrapper[4555]: I1209 15:02:30.905941 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 15:02:30 crc kubenswrapper[4555]: I1209 15:02:30.911312 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 09 15:02:30 crc kubenswrapper[4555]: I1209 15:02:30.917194 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podStartSLOduration=15.917184758 podStartE2EDuration="15.917184758s" podCreationTimestamp="2025-12-09 15:02:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:02:30.914949149 +0000 UTC m=+261.051661867" watchObservedRunningTime="2025-12-09 15:02:30.917184758 +0000 UTC m=+261.053897477" Dec 09 15:02:30 crc kubenswrapper[4555]: I1209 15:02:30.936460 4555 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 09 15:02:31 crc kubenswrapper[4555]: I1209 15:02:31.019963 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 09 15:02:31 crc kubenswrapper[4555]: I1209 15:02:31.116178 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 09 15:02:31 crc kubenswrapper[4555]: I1209 15:02:31.123004 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 09 15:02:31 crc kubenswrapper[4555]: I1209 15:02:31.344397 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 09 15:02:31 crc kubenswrapper[4555]: I1209 15:02:31.436819 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 09 15:02:31 crc kubenswrapper[4555]: I1209 15:02:31.527073 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 09 15:02:31 crc kubenswrapper[4555]: I1209 15:02:31.548542 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 09 15:02:31 crc kubenswrapper[4555]: I1209 15:02:31.608878 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 09 15:02:31 crc kubenswrapper[4555]: I1209 15:02:31.632276 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 09 15:02:31 crc kubenswrapper[4555]: I1209 15:02:31.691066 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 09 15:02:31 crc kubenswrapper[4555]: I1209 15:02:31.716615 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 09 15:02:31 crc kubenswrapper[4555]: I1209 15:02:31.724729 4555 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 09 15:02:31 crc kubenswrapper[4555]: I1209 15:02:31.835749 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 09 15:02:32 crc kubenswrapper[4555]: I1209 15:02:32.029027 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 09 15:02:32 crc kubenswrapper[4555]: I1209 15:02:32.115778 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 09 15:02:32 crc kubenswrapper[4555]: I1209 15:02:32.203213 4555 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 09 15:02:32 crc kubenswrapper[4555]: I1209 15:02:32.204554 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 09 15:02:32 crc kubenswrapper[4555]: I1209 15:02:32.220376 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 09 15:02:32 crc kubenswrapper[4555]: I1209 15:02:32.281450 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 09 15:02:32 crc kubenswrapper[4555]: I1209 15:02:32.298810 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 09 15:02:32 crc kubenswrapper[4555]: I1209 15:02:32.398977 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 09 15:02:32 crc kubenswrapper[4555]: I1209 15:02:32.408490 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 09 15:02:32 crc kubenswrapper[4555]: I1209 15:02:32.473419 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 09 15:02:32 crc kubenswrapper[4555]: I1209 15:02:32.523656 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 09 15:02:32 crc kubenswrapper[4555]: I1209 15:02:32.545723 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 09 15:02:32 crc kubenswrapper[4555]: I1209 15:02:32.569205 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 09 15:02:32 crc kubenswrapper[4555]: I1209 15:02:32.614143 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 09 15:02:32 crc kubenswrapper[4555]: I1209 15:02:32.847140 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 09 15:02:32 crc kubenswrapper[4555]: I1209 15:02:32.859098 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 09 15:02:32 crc kubenswrapper[4555]: I1209 15:02:32.879454 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 09 15:02:32 crc kubenswrapper[4555]: I1209 15:02:32.977263 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 09 15:02:33 crc kubenswrapper[4555]: I1209 15:02:33.137652 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 09 15:02:33 crc kubenswrapper[4555]: I1209 15:02:33.156726 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 09 15:02:33 crc kubenswrapper[4555]: I1209 15:02:33.230077 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 09 15:02:33 crc kubenswrapper[4555]: 
I1209 15:02:33.281341 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 09 15:02:33 crc kubenswrapper[4555]: I1209 15:02:33.303853 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 09 15:02:33 crc kubenswrapper[4555]: I1209 15:02:33.527061 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 09 15:02:33 crc kubenswrapper[4555]: I1209 15:02:33.592116 4555 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 09 15:02:33 crc kubenswrapper[4555]: I1209 15:02:33.641596 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 09 15:02:33 crc kubenswrapper[4555]: I1209 15:02:33.742811 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 09 15:02:33 crc kubenswrapper[4555]: I1209 15:02:33.744796 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 09 15:02:33 crc kubenswrapper[4555]: I1209 15:02:33.770778 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 09 15:02:33 crc kubenswrapper[4555]: I1209 15:02:33.789338 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 09 15:02:33 crc kubenswrapper[4555]: I1209 15:02:33.863449 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 09 15:02:33 crc kubenswrapper[4555]: I1209 15:02:33.866445 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 09 15:02:33 crc kubenswrapper[4555]: I1209 15:02:33.935734 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 09 15:02:33 crc kubenswrapper[4555]: I1209 15:02:33.938747 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 09 15:02:33 crc kubenswrapper[4555]: I1209 15:02:33.939679 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 09 15:02:34 crc kubenswrapper[4555]: I1209 15:02:34.039222 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 09 15:02:34 crc kubenswrapper[4555]: I1209 15:02:34.098721 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 09 15:02:34 crc kubenswrapper[4555]: I1209 15:02:34.105870 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 09 15:02:34 crc kubenswrapper[4555]: I1209 15:02:34.118979 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 09 15:02:34 crc kubenswrapper[4555]: I1209 15:02:34.159253 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 09 15:02:34 crc kubenswrapper[4555]: I1209 15:02:34.236560 4555 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 09 15:02:34 crc kubenswrapper[4555]: I1209 15:02:34.254792 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 09 15:02:34 crc kubenswrapper[4555]: I1209 15:02:34.301551 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 09 15:02:34 crc kubenswrapper[4555]: I1209 15:02:34.330704 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 09 15:02:34 crc kubenswrapper[4555]: I1209 15:02:34.345282 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 09 15:02:34 crc kubenswrapper[4555]: I1209 15:02:34.447907 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 09 15:02:34 crc kubenswrapper[4555]: I1209 15:02:34.515099 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 09 15:02:34 crc kubenswrapper[4555]: I1209 15:02:34.527784 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 09 15:02:34 crc kubenswrapper[4555]: I1209 15:02:34.549765 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 09 15:02:34 crc kubenswrapper[4555]: I1209 15:02:34.673095 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 09 15:02:34 crc kubenswrapper[4555]: I1209 15:02:34.689229 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 09 15:02:34 crc kubenswrapper[4555]: I1209 15:02:34.725005 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 09 15:02:34 crc kubenswrapper[4555]: I1209 15:02:34.727785 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 09 15:02:34 crc kubenswrapper[4555]: I1209 15:02:34.736414 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 09 15:02:34 crc kubenswrapper[4555]: I1209 15:02:34.765354 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 09 15:02:34 crc kubenswrapper[4555]: I1209 15:02:34.770050 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 09 15:02:34 crc kubenswrapper[4555]: I1209 15:02:34.825009 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 09 15:02:34 crc kubenswrapper[4555]: I1209 15:02:34.990759 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 09 15:02:35 crc kubenswrapper[4555]: I1209 15:02:35.024665 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 09 15:02:35 crc kubenswrapper[4555]: I1209 15:02:35.056078 4555 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 09 15:02:35 crc kubenswrapper[4555]: I1209 15:02:35.074960 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 09 15:02:35 crc kubenswrapper[4555]: I1209 15:02:35.095026 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 09 15:02:35 crc kubenswrapper[4555]: I1209 15:02:35.255860 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 09 15:02:35 crc kubenswrapper[4555]: I1209 15:02:35.257467 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 09 15:02:35 crc kubenswrapper[4555]: I1209 15:02:35.291459 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 09 15:02:35 crc kubenswrapper[4555]: I1209 15:02:35.302746 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 09 15:02:35 crc kubenswrapper[4555]: I1209 15:02:35.307566 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 09 15:02:35 crc kubenswrapper[4555]: I1209 15:02:35.423402 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 09 15:02:35 crc kubenswrapper[4555]: I1209 15:02:35.507066 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 09 15:02:35 crc kubenswrapper[4555]: I1209 15:02:35.558251 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 09 15:02:35 crc kubenswrapper[4555]: I1209 15:02:35.751774 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 09 15:02:35 crc kubenswrapper[4555]: I1209 15:02:35.793866 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 09 15:02:35 crc kubenswrapper[4555]: I1209 15:02:35.854199 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 09 15:02:35 crc kubenswrapper[4555]: I1209 15:02:35.915715 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 09 15:02:35 crc kubenswrapper[4555]: I1209 15:02:35.965784 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 09 15:02:35 crc kubenswrapper[4555]: I1209 15:02:35.981841 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 09 15:02:36 crc kubenswrapper[4555]: I1209 15:02:36.061492 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 09 15:02:36 crc kubenswrapper[4555]: I1209 15:02:36.115894 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 09 15:02:36 crc kubenswrapper[4555]: I1209 15:02:36.117686 4555 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 09 15:02:36 crc kubenswrapper[4555]: I1209 15:02:36.196101 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 09 15:02:36 crc kubenswrapper[4555]: I1209 15:02:36.219821 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 09 15:02:36 crc kubenswrapper[4555]: I1209 15:02:36.311730 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 09 15:02:36 crc kubenswrapper[4555]: I1209 15:02:36.325873 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 09 15:02:36 crc kubenswrapper[4555]: I1209 15:02:36.352032 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 09 15:02:36 crc kubenswrapper[4555]: I1209 15:02:36.584363 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 09 15:02:36 crc kubenswrapper[4555]: I1209 15:02:36.649732 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 09 15:02:36 crc kubenswrapper[4555]: I1209 15:02:36.691868 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 09 15:02:36 crc kubenswrapper[4555]: I1209 15:02:36.699818 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 09 15:02:36 crc kubenswrapper[4555]: I1209 15:02:36.707755 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 09 15:02:36 crc kubenswrapper[4555]: I1209 15:02:36.710518 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 09 15:02:36 crc kubenswrapper[4555]: I1209 15:02:36.720725 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 09 15:02:36 crc kubenswrapper[4555]: I1209 15:02:36.721753 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 09 15:02:36 crc kubenswrapper[4555]: I1209 15:02:36.760687 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 09 15:02:36 crc kubenswrapper[4555]: I1209 15:02:36.782812 4555 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 09 15:02:36 crc kubenswrapper[4555]: I1209 15:02:36.783040 4555 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://a9f634a95c76b3a1a725d945590ee6d5166f8b1711fe56be36eb555c46b354bd" gracePeriod=5 Dec 09 15:02:36 crc kubenswrapper[4555]: I1209 15:02:36.843230 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 09 15:02:36 crc kubenswrapper[4555]: I1209 15:02:36.860774 4555 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 09 15:02:36 crc kubenswrapper[4555]: I1209 15:02:36.862232 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 09 15:02:36 crc kubenswrapper[4555]: I1209 15:02:36.899491 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 09 15:02:36 crc kubenswrapper[4555]: I1209 15:02:36.899612 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 09 15:02:36 crc kubenswrapper[4555]: I1209 15:02:36.934289 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 09 15:02:36 crc kubenswrapper[4555]: I1209 15:02:36.954792 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 09 15:02:37 crc kubenswrapper[4555]: I1209 15:02:37.052705 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 09 15:02:37 crc kubenswrapper[4555]: I1209 15:02:37.119191 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 09 15:02:37 crc kubenswrapper[4555]: I1209 15:02:37.191049 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 09 15:02:37 crc kubenswrapper[4555]: I1209 15:02:37.292472 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 09 15:02:37 crc kubenswrapper[4555]: I1209 15:02:37.317102 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 09 15:02:37 crc kubenswrapper[4555]: I1209 15:02:37.331948 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 09 15:02:37 crc kubenswrapper[4555]: I1209 15:02:37.363240 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 09 15:02:37 crc kubenswrapper[4555]: I1209 15:02:37.380244 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 09 15:02:37 crc kubenswrapper[4555]: I1209 15:02:37.393175 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 09 15:02:37 crc kubenswrapper[4555]: I1209 15:02:37.416287 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 09 15:02:37 crc kubenswrapper[4555]: I1209 15:02:37.422663 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 09 15:02:37 crc kubenswrapper[4555]: I1209 15:02:37.429386 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 09 15:02:37 crc kubenswrapper[4555]: I1209 15:02:37.472128 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 09 15:02:37 crc kubenswrapper[4555]: I1209 15:02:37.479415 4555 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 09 15:02:37 crc kubenswrapper[4555]: I1209 15:02:37.482705 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 09 15:02:37 crc kubenswrapper[4555]: I1209 15:02:37.490993 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 09 15:02:37 crc kubenswrapper[4555]: I1209 15:02:37.524294 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 09 15:02:37 crc kubenswrapper[4555]: I1209 15:02:37.640541 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 09 15:02:37 crc kubenswrapper[4555]: I1209 15:02:37.656082 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 09 15:02:37 crc kubenswrapper[4555]: I1209 15:02:37.680126 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 09 15:02:37 crc kubenswrapper[4555]: I1209 15:02:37.724870 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 09 15:02:37 crc kubenswrapper[4555]: I1209 15:02:37.747562 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 09 15:02:37 crc kubenswrapper[4555]: I1209 15:02:37.757260 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 09 15:02:37 crc kubenswrapper[4555]: I1209 15:02:37.842630 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 09 15:02:37 crc kubenswrapper[4555]: I1209 15:02:37.855782 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 09 15:02:37 crc kubenswrapper[4555]: I1209 15:02:37.899232 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 09 15:02:37 crc kubenswrapper[4555]: I1209 15:02:37.950476 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 09 15:02:37 crc kubenswrapper[4555]: I1209 15:02:37.961857 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 09 15:02:38 crc kubenswrapper[4555]: I1209 15:02:38.222604 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 09 15:02:38 crc kubenswrapper[4555]: I1209 15:02:38.263778 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 09 15:02:38 crc kubenswrapper[4555]: I1209 15:02:38.274256 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 09 15:02:38 crc kubenswrapper[4555]: I1209 15:02:38.292696 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 09 15:02:38 crc kubenswrapper[4555]: I1209 15:02:38.317148 4555 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 09 15:02:38 crc kubenswrapper[4555]: I1209 15:02:38.487108 4555 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bdb9r"] Dec 09 15:02:38 crc kubenswrapper[4555]: I1209 15:02:38.487396 4555 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bdb9r" podUID="fd5cca88-9272-46b2-89f1-b0c2b44e2a9f" containerName="registry-server" containerID="cri-o://4ec27fbe6f3735de959aa11530f42a6fce5037a4db3e8eaa07ccb4da6b1dd517" gracePeriod=30 Dec 09 15:02:38 crc kubenswrapper[4555]: I1209 15:02:38.497277 4555 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mc787"] Dec 09 15:02:38 crc kubenswrapper[4555]: I1209 15:02:38.497552 4555 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mc787" podUID="14dca34a-64a8-4190-aa9a-07c5cb2bdf33" containerName="registry-server" containerID="cri-o://f654a0b4c4ba6add363d7b89f0b30e83beacdd62392ff79cd31849ab5e4f0de7" gracePeriod=30 Dec 09 15:02:38 crc kubenswrapper[4555]: I1209 15:02:38.510851 4555 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-srlsw"] Dec 09 15:02:38 crc kubenswrapper[4555]: I1209 15:02:38.511079 4555 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-srlsw" podUID="8c1c0513-5d12-463c-b66d-c1a5d02eafc0" containerName="marketplace-operator" containerID="cri-o://703a32908d95de30c0a23d8b5ccf34acd82aa5d32fd58b62abf083a63d3aec9f" gracePeriod=30 Dec 09 15:02:38 crc kubenswrapper[4555]: I1209 15:02:38.515934 4555 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7d49r"] Dec 09 15:02:38 crc kubenswrapper[4555]: I1209 15:02:38.516132 4555 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7d49r" podUID="af79ef0a-fda8-4009-b915-297808ea4398" containerName="registry-server" containerID="cri-o://15719dafad2ef5fef86c9a7f2f4402239d8a2237de04f545d7663b71c380cc85" gracePeriod=30 Dec 09 15:02:38 crc kubenswrapper[4555]: I1209 15:02:38.519044 4555 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dwd6f"] Dec 09 15:02:38 crc kubenswrapper[4555]: I1209 15:02:38.519191 4555 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dwd6f" podUID="5b3ae0d5-62f7-4219-b006-f6a33d913018" containerName="registry-server" containerID="cri-o://173087e7be1476e4a92e54c73ed963e224973a215618d17758b9115edad9ad6b" gracePeriod=30 Dec 09 15:02:38 crc kubenswrapper[4555]: I1209 15:02:38.548004 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-pqkv4"] Dec 09 15:02:38 crc kubenswrapper[4555]: E1209 15:02:38.548181 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 09 15:02:38 crc kubenswrapper[4555]: I1209 15:02:38.548197 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 09 15:02:38 crc kubenswrapper[4555]: E1209 15:02:38.548213 4555 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="bbdfa4d2-08c3-40dd-b100-622dceb965cd" containerName="installer" Dec 09 15:02:38 crc kubenswrapper[4555]: I1209 15:02:38.548219 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbdfa4d2-08c3-40dd-b100-622dceb965cd" containerName="installer" Dec 09 15:02:38 crc kubenswrapper[4555]: I1209 15:02:38.548659 4555 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbdfa4d2-08c3-40dd-b100-622dceb965cd" containerName="installer" Dec 09 15:02:38 crc kubenswrapper[4555]: I1209 15:02:38.548684 4555 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 09 15:02:38 crc kubenswrapper[4555]: I1209 15:02:38.548998 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-pqkv4" Dec 09 15:02:38 crc kubenswrapper[4555]: I1209 15:02:38.556260 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-pqkv4"] Dec 09 15:02:38 crc kubenswrapper[4555]: I1209 15:02:38.569219 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 09 15:02:38 crc kubenswrapper[4555]: I1209 15:02:38.587174 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 09 15:02:38 crc kubenswrapper[4555]: I1209 15:02:38.612244 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 09 15:02:38 crc kubenswrapper[4555]: I1209 15:02:38.618148 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 09 15:02:38 crc kubenswrapper[4555]: I1209 15:02:38.636936 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9ef9ab0f-e5f9-40b4-b761-666bd4e1832d-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-pqkv4\" (UID: \"9ef9ab0f-e5f9-40b4-b761-666bd4e1832d\") " pod="openshift-marketplace/marketplace-operator-79b997595-pqkv4" Dec 09 15:02:38 crc kubenswrapper[4555]: I1209 15:02:38.637386 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsz7z\" (UniqueName: \"kubernetes.io/projected/9ef9ab0f-e5f9-40b4-b761-666bd4e1832d-kube-api-access-vsz7z\") pod \"marketplace-operator-79b997595-pqkv4\" (UID: \"9ef9ab0f-e5f9-40b4-b761-666bd4e1832d\") " pod="openshift-marketplace/marketplace-operator-79b997595-pqkv4" Dec 09 15:02:38 crc kubenswrapper[4555]: I1209 15:02:38.637411 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/9ef9ab0f-e5f9-40b4-b761-666bd4e1832d-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-pqkv4\" (UID: \"9ef9ab0f-e5f9-40b4-b761-666bd4e1832d\") " pod="openshift-marketplace/marketplace-operator-79b997595-pqkv4" Dec 09 15:02:38 crc kubenswrapper[4555]: I1209 15:02:38.649604 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 09 15:02:38 crc kubenswrapper[4555]: I1209 15:02:38.739159 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/9ef9ab0f-e5f9-40b4-b761-666bd4e1832d-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-pqkv4\" (UID: \"9ef9ab0f-e5f9-40b4-b761-666bd4e1832d\") " pod="openshift-marketplace/marketplace-operator-79b997595-pqkv4" Dec 09 15:02:38 crc kubenswrapper[4555]: I1209 15:02:38.739218 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsz7z\" (UniqueName: \"kubernetes.io/projected/9ef9ab0f-e5f9-40b4-b761-666bd4e1832d-kube-api-access-vsz7z\") pod \"marketplace-operator-79b997595-pqkv4\" (UID: \"9ef9ab0f-e5f9-40b4-b761-666bd4e1832d\") " pod="openshift-marketplace/marketplace-operator-79b997595-pqkv4" Dec 09 15:02:38 crc kubenswrapper[4555]: I1209 15:02:38.739238 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/9ef9ab0f-e5f9-40b4-b761-666bd4e1832d-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-pqkv4\" (UID: \"9ef9ab0f-e5f9-40b4-b761-666bd4e1832d\") " pod="openshift-marketplace/marketplace-operator-79b997595-pqkv4" Dec 09 15:02:38 crc kubenswrapper[4555]: I1209 15:02:38.740895 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9ef9ab0f-e5f9-40b4-b761-666bd4e1832d-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-pqkv4\" (UID: \"9ef9ab0f-e5f9-40b4-b761-666bd4e1832d\") " pod="openshift-marketplace/marketplace-operator-79b997595-pqkv4" Dec 09 15:02:38 crc kubenswrapper[4555]: I1209 15:02:38.745790 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/9ef9ab0f-e5f9-40b4-b761-666bd4e1832d-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-pqkv4\" (UID: \"9ef9ab0f-e5f9-40b4-b761-666bd4e1832d\") " pod="openshift-marketplace/marketplace-operator-79b997595-pqkv4" Dec 09 15:02:38 crc kubenswrapper[4555]: I1209 15:02:38.754684 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsz7z\" (UniqueName: \"kubernetes.io/projected/9ef9ab0f-e5f9-40b4-b761-666bd4e1832d-kube-api-access-vsz7z\") pod \"marketplace-operator-79b997595-pqkv4\" (UID: \"9ef9ab0f-e5f9-40b4-b761-666bd4e1832d\") " pod="openshift-marketplace/marketplace-operator-79b997595-pqkv4" Dec 09 15:02:38 crc kubenswrapper[4555]: I1209 15:02:38.908392 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-pqkv4" Dec 09 15:02:38 crc kubenswrapper[4555]: I1209 15:02:38.913095 4555 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mc787" Dec 09 15:02:38 crc kubenswrapper[4555]: I1209 15:02:38.940828 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14dca34a-64a8-4190-aa9a-07c5cb2bdf33-utilities\") pod \"14dca34a-64a8-4190-aa9a-07c5cb2bdf33\" (UID: \"14dca34a-64a8-4190-aa9a-07c5cb2bdf33\") " Dec 09 15:02:38 crc kubenswrapper[4555]: I1209 15:02:38.940919 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14dca34a-64a8-4190-aa9a-07c5cb2bdf33-catalog-content\") pod \"14dca34a-64a8-4190-aa9a-07c5cb2bdf33\" (UID: \"14dca34a-64a8-4190-aa9a-07c5cb2bdf33\") " Dec 09 15:02:38 crc kubenswrapper[4555]: I1209 15:02:38.941006 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmvwx\" (UniqueName: \"kubernetes.io/projected/14dca34a-64a8-4190-aa9a-07c5cb2bdf33-kube-api-access-cmvwx\") pod \"14dca34a-64a8-4190-aa9a-07c5cb2bdf33\" (UID: \"14dca34a-64a8-4190-aa9a-07c5cb2bdf33\") " Dec 09 15:02:38 crc kubenswrapper[4555]: I1209 15:02:38.942425 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14dca34a-64a8-4190-aa9a-07c5cb2bdf33-utilities" (OuterVolumeSpecName: "utilities") pod "14dca34a-64a8-4190-aa9a-07c5cb2bdf33" (UID: "14dca34a-64a8-4190-aa9a-07c5cb2bdf33"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:02:38 crc kubenswrapper[4555]: I1209 15:02:38.944787 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14dca34a-64a8-4190-aa9a-07c5cb2bdf33-kube-api-access-cmvwx" (OuterVolumeSpecName: "kube-api-access-cmvwx") pod "14dca34a-64a8-4190-aa9a-07c5cb2bdf33" (UID: "14dca34a-64a8-4190-aa9a-07c5cb2bdf33"). InnerVolumeSpecName "kube-api-access-cmvwx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:02:38 crc kubenswrapper[4555]: I1209 15:02:38.949976 4555 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bdb9r" Dec 09 15:02:38 crc kubenswrapper[4555]: I1209 15:02:38.952130 4555 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7d49r" Dec 09 15:02:38 crc kubenswrapper[4555]: I1209 15:02:38.954224 4555 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-srlsw" Dec 09 15:02:38 crc kubenswrapper[4555]: I1209 15:02:38.960779 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 09 15:02:38 crc kubenswrapper[4555]: I1209 15:02:38.966345 4555 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dwd6f" Dec 09 15:02:38 crc kubenswrapper[4555]: I1209 15:02:38.974594 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:38.998591 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14dca34a-64a8-4190-aa9a-07c5cb2bdf33-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "14dca34a-64a8-4190-aa9a-07c5cb2bdf33" (UID: "14dca34a-64a8-4190-aa9a-07c5cb2bdf33"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.041538 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drswg\" (UniqueName: \"kubernetes.io/projected/fd5cca88-9272-46b2-89f1-b0c2b44e2a9f-kube-api-access-drswg\") pod \"fd5cca88-9272-46b2-89f1-b0c2b44e2a9f\" (UID: \"fd5cca88-9272-46b2-89f1-b0c2b44e2a9f\") " Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.041567 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbvv6\" (UniqueName: \"kubernetes.io/projected/af79ef0a-fda8-4009-b915-297808ea4398-kube-api-access-vbvv6\") pod \"af79ef0a-fda8-4009-b915-297808ea4398\" (UID: \"af79ef0a-fda8-4009-b915-297808ea4398\") " Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.041590 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd5cca88-9272-46b2-89f1-b0c2b44e2a9f-catalog-content\") pod \"fd5cca88-9272-46b2-89f1-b0c2b44e2a9f\" (UID: \"fd5cca88-9272-46b2-89f1-b0c2b44e2a9f\") " Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.041606 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd5cca88-9272-46b2-89f1-b0c2b44e2a9f-utilities\") pod \"fd5cca88-9272-46b2-89f1-b0c2b44e2a9f\" (UID: \"fd5cca88-9272-46b2-89f1-b0c2b44e2a9f\") " Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.041623 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b3ae0d5-62f7-4219-b006-f6a33d913018-catalog-content\") pod \"5b3ae0d5-62f7-4219-b006-f6a33d913018\" (UID: \"5b3ae0d5-62f7-4219-b006-f6a33d913018\") " Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.041653 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4dds\" (UniqueName: \"kubernetes.io/projected/8c1c0513-5d12-463c-b66d-c1a5d02eafc0-kube-api-access-z4dds\") pod \"8c1c0513-5d12-463c-b66d-c1a5d02eafc0\" (UID: \"8c1c0513-5d12-463c-b66d-c1a5d02eafc0\") " Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.041669 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b3ae0d5-62f7-4219-b006-f6a33d913018-utilities\") pod \"5b3ae0d5-62f7-4219-b006-f6a33d913018\" (UID: \"5b3ae0d5-62f7-4219-b006-f6a33d913018\") " Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.041687 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af79ef0a-fda8-4009-b915-297808ea4398-utilities\") pod \"af79ef0a-fda8-4009-b915-297808ea4398\" (UID: \"af79ef0a-fda8-4009-b915-297808ea4398\") " Dec 09 
15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.041709 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af79ef0a-fda8-4009-b915-297808ea4398-catalog-content\") pod \"af79ef0a-fda8-4009-b915-297808ea4398\" (UID: \"af79ef0a-fda8-4009-b915-297808ea4398\") " Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.041726 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8c1c0513-5d12-463c-b66d-c1a5d02eafc0-marketplace-trusted-ca\") pod \"8c1c0513-5d12-463c-b66d-c1a5d02eafc0\" (UID: \"8c1c0513-5d12-463c-b66d-c1a5d02eafc0\") " Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.041751 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8c1c0513-5d12-463c-b66d-c1a5d02eafc0-marketplace-operator-metrics\") pod \"8c1c0513-5d12-463c-b66d-c1a5d02eafc0\" (UID: \"8c1c0513-5d12-463c-b66d-c1a5d02eafc0\") " Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.041775 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rj6wx\" (UniqueName: \"kubernetes.io/projected/5b3ae0d5-62f7-4219-b006-f6a33d913018-kube-api-access-rj6wx\") pod \"5b3ae0d5-62f7-4219-b006-f6a33d913018\" (UID: \"5b3ae0d5-62f7-4219-b006-f6a33d913018\") " Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.041898 4555 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14dca34a-64a8-4190-aa9a-07c5cb2bdf33-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.041909 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cmvwx\" (UniqueName: \"kubernetes.io/projected/14dca34a-64a8-4190-aa9a-07c5cb2bdf33-kube-api-access-cmvwx\") on node \"crc\" DevicePath \"\"" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.041918 4555 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14dca34a-64a8-4190-aa9a-07c5cb2bdf33-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.042192 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd5cca88-9272-46b2-89f1-b0c2b44e2a9f-utilities" (OuterVolumeSpecName: "utilities") pod "fd5cca88-9272-46b2-89f1-b0c2b44e2a9f" (UID: "fd5cca88-9272-46b2-89f1-b0c2b44e2a9f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.042510 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af79ef0a-fda8-4009-b915-297808ea4398-utilities" (OuterVolumeSpecName: "utilities") pod "af79ef0a-fda8-4009-b915-297808ea4398" (UID: "af79ef0a-fda8-4009-b915-297808ea4398"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.042745 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c1c0513-5d12-463c-b66d-c1a5d02eafc0-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "8c1c0513-5d12-463c-b66d-c1a5d02eafc0" (UID: "8c1c0513-5d12-463c-b66d-c1a5d02eafc0"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.042741 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b3ae0d5-62f7-4219-b006-f6a33d913018-utilities" (OuterVolumeSpecName: "utilities") pod "5b3ae0d5-62f7-4219-b006-f6a33d913018" (UID: "5b3ae0d5-62f7-4219-b006-f6a33d913018"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.045408 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b3ae0d5-62f7-4219-b006-f6a33d913018-kube-api-access-rj6wx" (OuterVolumeSpecName: "kube-api-access-rj6wx") pod "5b3ae0d5-62f7-4219-b006-f6a33d913018" (UID: "5b3ae0d5-62f7-4219-b006-f6a33d913018"). InnerVolumeSpecName "kube-api-access-rj6wx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.046093 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af79ef0a-fda8-4009-b915-297808ea4398-kube-api-access-vbvv6" (OuterVolumeSpecName: "kube-api-access-vbvv6") pod "af79ef0a-fda8-4009-b915-297808ea4398" (UID: "af79ef0a-fda8-4009-b915-297808ea4398"). InnerVolumeSpecName "kube-api-access-vbvv6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.046458 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd5cca88-9272-46b2-89f1-b0c2b44e2a9f-kube-api-access-drswg" (OuterVolumeSpecName: "kube-api-access-drswg") pod "fd5cca88-9272-46b2-89f1-b0c2b44e2a9f" (UID: "fd5cca88-9272-46b2-89f1-b0c2b44e2a9f"). InnerVolumeSpecName "kube-api-access-drswg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.046634 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c1c0513-5d12-463c-b66d-c1a5d02eafc0-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "8c1c0513-5d12-463c-b66d-c1a5d02eafc0" (UID: "8c1c0513-5d12-463c-b66d-c1a5d02eafc0"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.047590 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c1c0513-5d12-463c-b66d-c1a5d02eafc0-kube-api-access-z4dds" (OuterVolumeSpecName: "kube-api-access-z4dds") pod "8c1c0513-5d12-463c-b66d-c1a5d02eafc0" (UID: "8c1c0513-5d12-463c-b66d-c1a5d02eafc0"). InnerVolumeSpecName "kube-api-access-z4dds". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.060579 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af79ef0a-fda8-4009-b915-297808ea4398-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "af79ef0a-fda8-4009-b915-297808ea4398" (UID: "af79ef0a-fda8-4009-b915-297808ea4398"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.090856 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.093490 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.096957 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd5cca88-9272-46b2-89f1-b0c2b44e2a9f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fd5cca88-9272-46b2-89f1-b0c2b44e2a9f" (UID: "fd5cca88-9272-46b2-89f1-b0c2b44e2a9f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.115766 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.139068 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b3ae0d5-62f7-4219-b006-f6a33d913018-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5b3ae0d5-62f7-4219-b006-f6a33d913018" (UID: "5b3ae0d5-62f7-4219-b006-f6a33d913018"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.142362 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drswg\" (UniqueName: \"kubernetes.io/projected/fd5cca88-9272-46b2-89f1-b0c2b44e2a9f-kube-api-access-drswg\") on node \"crc\" DevicePath \"\"" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.142388 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbvv6\" (UniqueName: \"kubernetes.io/projected/af79ef0a-fda8-4009-b915-297808ea4398-kube-api-access-vbvv6\") on node \"crc\" DevicePath \"\"" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.142400 4555 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd5cca88-9272-46b2-89f1-b0c2b44e2a9f-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.142413 4555 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd5cca88-9272-46b2-89f1-b0c2b44e2a9f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.142423 4555 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b3ae0d5-62f7-4219-b006-f6a33d913018-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.142431 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4dds\" (UniqueName: \"kubernetes.io/projected/8c1c0513-5d12-463c-b66d-c1a5d02eafc0-kube-api-access-z4dds\") on node \"crc\" DevicePath \"\"" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.142439 4555 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b3ae0d5-62f7-4219-b006-f6a33d913018-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.142447 4555 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/af79ef0a-fda8-4009-b915-297808ea4398-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.142454 4555 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af79ef0a-fda8-4009-b915-297808ea4398-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.142462 4555 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8c1c0513-5d12-463c-b66d-c1a5d02eafc0-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.142469 4555 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8c1c0513-5d12-463c-b66d-c1a5d02eafc0-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.142478 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rj6wx\" (UniqueName: \"kubernetes.io/projected/5b3ae0d5-62f7-4219-b006-f6a33d913018-kube-api-access-rj6wx\") on node \"crc\" DevicePath \"\"" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.195218 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.218996 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.294203 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-pqkv4"] Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.364076 4555 generic.go:334] "Generic (PLEG): container finished" podID="5b3ae0d5-62f7-4219-b006-f6a33d913018" containerID="173087e7be1476e4a92e54c73ed963e224973a215618d17758b9115edad9ad6b" exitCode=0 Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.364127 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dwd6f" event={"ID":"5b3ae0d5-62f7-4219-b006-f6a33d913018","Type":"ContainerDied","Data":"173087e7be1476e4a92e54c73ed963e224973a215618d17758b9115edad9ad6b"} Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.364149 4555 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dwd6f" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.364184 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dwd6f" event={"ID":"5b3ae0d5-62f7-4219-b006-f6a33d913018","Type":"ContainerDied","Data":"f87e573a8339f21fd33bcff79510765cbd0fbea1c20eba86027f37f21ad75ea8"} Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.364203 4555 scope.go:117] "RemoveContainer" containerID="173087e7be1476e4a92e54c73ed963e224973a215618d17758b9115edad9ad6b" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.366037 4555 generic.go:334] "Generic (PLEG): container finished" podID="14dca34a-64a8-4190-aa9a-07c5cb2bdf33" containerID="f654a0b4c4ba6add363d7b89f0b30e83beacdd62392ff79cd31849ab5e4f0de7" exitCode=0 Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.366088 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mc787" event={"ID":"14dca34a-64a8-4190-aa9a-07c5cb2bdf33","Type":"ContainerDied","Data":"f654a0b4c4ba6add363d7b89f0b30e83beacdd62392ff79cd31849ab5e4f0de7"} Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.366107 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mc787" event={"ID":"14dca34a-64a8-4190-aa9a-07c5cb2bdf33","Type":"ContainerDied","Data":"d475c7287cf23068800595aa02c9ef28eb1d21373a0436cc3dcbbfac253d5807"} Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.366172 4555 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mc787" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.369544 4555 generic.go:334] "Generic (PLEG): container finished" podID="af79ef0a-fda8-4009-b915-297808ea4398" containerID="15719dafad2ef5fef86c9a7f2f4402239d8a2237de04f545d7663b71c380cc85" exitCode=0 Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.369578 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7d49r" event={"ID":"af79ef0a-fda8-4009-b915-297808ea4398","Type":"ContainerDied","Data":"15719dafad2ef5fef86c9a7f2f4402239d8a2237de04f545d7663b71c380cc85"} Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.369593 4555 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7d49r" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.369611 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7d49r" event={"ID":"af79ef0a-fda8-4009-b915-297808ea4398","Type":"ContainerDied","Data":"276a29ca5488592d7ef5b8e9413ef24bc37c27518ad68dd09944711293c88c88"} Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.371772 4555 generic.go:334] "Generic (PLEG): container finished" podID="fd5cca88-9272-46b2-89f1-b0c2b44e2a9f" containerID="4ec27fbe6f3735de959aa11530f42a6fce5037a4db3e8eaa07ccb4da6b1dd517" exitCode=0 Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.371828 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bdb9r" event={"ID":"fd5cca88-9272-46b2-89f1-b0c2b44e2a9f","Type":"ContainerDied","Data":"4ec27fbe6f3735de959aa11530f42a6fce5037a4db3e8eaa07ccb4da6b1dd517"} Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.371847 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bdb9r" event={"ID":"fd5cca88-9272-46b2-89f1-b0c2b44e2a9f","Type":"ContainerDied","Data":"1e7abbf7290011846f2420736a75def45163b58689e32ca69e75c8820223e8a7"} Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.371830 4555 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bdb9r" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.373931 4555 generic.go:334] "Generic (PLEG): container finished" podID="8c1c0513-5d12-463c-b66d-c1a5d02eafc0" containerID="703a32908d95de30c0a23d8b5ccf34acd82aa5d32fd58b62abf083a63d3aec9f" exitCode=0 Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.373992 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-srlsw" event={"ID":"8c1c0513-5d12-463c-b66d-c1a5d02eafc0","Type":"ContainerDied","Data":"703a32908d95de30c0a23d8b5ccf34acd82aa5d32fd58b62abf083a63d3aec9f"} Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.374017 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-srlsw" event={"ID":"8c1c0513-5d12-463c-b66d-c1a5d02eafc0","Type":"ContainerDied","Data":"ef60cc6953dc083a896c67ecc9c9e87895dedaf752c579a448ca1aa93cabd0ac"} Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.374068 4555 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-srlsw" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.378836 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-pqkv4" event={"ID":"9ef9ab0f-e5f9-40b4-b761-666bd4e1832d","Type":"ContainerStarted","Data":"4cbac516b0e60ff8e20cb9a136aba32e894c61056e4d3bf334cbd80db8a14860"} Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.385953 4555 scope.go:117] "RemoveContainer" containerID="eea80f8422e2e199412dce75228dfb0549eb133fedcf9987ada61eac543bf64e" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.400993 4555 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mc787"] Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.404528 4555 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mc787"] Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.416933 4555 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7d49r"] Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.419864 4555 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7d49r"] Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.423123 4555 scope.go:117] "RemoveContainer" containerID="0890db3247a8baaba0f4d0ff9a3f2b9b777f2a6f361280e6fb2e22cee4c464a5" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.427101 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.427279 4555 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dwd6f"] Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.432929 4555 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dwd6f"] Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.434400 4555 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bdb9r"] Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.436698 4555 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bdb9r"] Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.443627 4555 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-srlsw"] Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.446441 4555 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-srlsw"] Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.448454 4555 scope.go:117] "RemoveContainer" containerID="173087e7be1476e4a92e54c73ed963e224973a215618d17758b9115edad9ad6b" Dec 09 15:02:39 crc kubenswrapper[4555]: E1209 15:02:39.448880 4555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"173087e7be1476e4a92e54c73ed963e224973a215618d17758b9115edad9ad6b\": container with ID starting with 173087e7be1476e4a92e54c73ed963e224973a215618d17758b9115edad9ad6b not found: ID does not exist" containerID="173087e7be1476e4a92e54c73ed963e224973a215618d17758b9115edad9ad6b" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.448911 4555 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"173087e7be1476e4a92e54c73ed963e224973a215618d17758b9115edad9ad6b"} err="failed to get container status \"173087e7be1476e4a92e54c73ed963e224973a215618d17758b9115edad9ad6b\": rpc error: code = NotFound desc = could not find container \"173087e7be1476e4a92e54c73ed963e224973a215618d17758b9115edad9ad6b\": container with ID starting with 173087e7be1476e4a92e54c73ed963e224973a215618d17758b9115edad9ad6b not found: ID does not exist" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.448933 4555 scope.go:117] "RemoveContainer" containerID="eea80f8422e2e199412dce75228dfb0549eb133fedcf9987ada61eac543bf64e" Dec 09 15:02:39 crc kubenswrapper[4555]: E1209 15:02:39.449426 4555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eea80f8422e2e199412dce75228dfb0549eb133fedcf9987ada61eac543bf64e\": container with ID starting with eea80f8422e2e199412dce75228dfb0549eb133fedcf9987ada61eac543bf64e not found: ID does not exist" containerID="eea80f8422e2e199412dce75228dfb0549eb133fedcf9987ada61eac543bf64e" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.449456 4555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eea80f8422e2e199412dce75228dfb0549eb133fedcf9987ada61eac543bf64e"} err="failed to get container status \"eea80f8422e2e199412dce75228dfb0549eb133fedcf9987ada61eac543bf64e\": rpc error: code = NotFound desc = could not find container \"eea80f8422e2e199412dce75228dfb0549eb133fedcf9987ada61eac543bf64e\": container with ID starting with eea80f8422e2e199412dce75228dfb0549eb133fedcf9987ada61eac543bf64e not found: ID does not exist" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.449478 4555 scope.go:117] "RemoveContainer" containerID="0890db3247a8baaba0f4d0ff9a3f2b9b777f2a6f361280e6fb2e22cee4c464a5" Dec 09 15:02:39 crc kubenswrapper[4555]: E1209 15:02:39.449820 4555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0890db3247a8baaba0f4d0ff9a3f2b9b777f2a6f361280e6fb2e22cee4c464a5\": container with ID starting with 0890db3247a8baaba0f4d0ff9a3f2b9b777f2a6f361280e6fb2e22cee4c464a5 not found: ID does not exist" containerID="0890db3247a8baaba0f4d0ff9a3f2b9b777f2a6f361280e6fb2e22cee4c464a5" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.449841 4555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0890db3247a8baaba0f4d0ff9a3f2b9b777f2a6f361280e6fb2e22cee4c464a5"} err="failed to get container status \"0890db3247a8baaba0f4d0ff9a3f2b9b777f2a6f361280e6fb2e22cee4c464a5\": rpc error: code = NotFound desc = could not find container \"0890db3247a8baaba0f4d0ff9a3f2b9b777f2a6f361280e6fb2e22cee4c464a5\": container with ID starting with 0890db3247a8baaba0f4d0ff9a3f2b9b777f2a6f361280e6fb2e22cee4c464a5 not found: ID does not exist" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.449859 4555 scope.go:117] "RemoveContainer" containerID="f654a0b4c4ba6add363d7b89f0b30e83beacdd62392ff79cd31849ab5e4f0de7" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.464995 4555 scope.go:117] "RemoveContainer" containerID="bbd7dd5281f9d575736eacd577f826add0efd6e39abe6e8f4740fc58244951d4" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.479503 4555 scope.go:117] "RemoveContainer" containerID="cf218e03cb5d7f91d68b277fb9b05d0ac2942940e11bbd8bc9073aa7c0ba1b29" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.493508 4555 
scope.go:117] "RemoveContainer" containerID="f654a0b4c4ba6add363d7b89f0b30e83beacdd62392ff79cd31849ab5e4f0de7" Dec 09 15:02:39 crc kubenswrapper[4555]: E1209 15:02:39.493860 4555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f654a0b4c4ba6add363d7b89f0b30e83beacdd62392ff79cd31849ab5e4f0de7\": container with ID starting with f654a0b4c4ba6add363d7b89f0b30e83beacdd62392ff79cd31849ab5e4f0de7 not found: ID does not exist" containerID="f654a0b4c4ba6add363d7b89f0b30e83beacdd62392ff79cd31849ab5e4f0de7" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.493888 4555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f654a0b4c4ba6add363d7b89f0b30e83beacdd62392ff79cd31849ab5e4f0de7"} err="failed to get container status \"f654a0b4c4ba6add363d7b89f0b30e83beacdd62392ff79cd31849ab5e4f0de7\": rpc error: code = NotFound desc = could not find container \"f654a0b4c4ba6add363d7b89f0b30e83beacdd62392ff79cd31849ab5e4f0de7\": container with ID starting with f654a0b4c4ba6add363d7b89f0b30e83beacdd62392ff79cd31849ab5e4f0de7 not found: ID does not exist" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.493905 4555 scope.go:117] "RemoveContainer" containerID="bbd7dd5281f9d575736eacd577f826add0efd6e39abe6e8f4740fc58244951d4" Dec 09 15:02:39 crc kubenswrapper[4555]: E1209 15:02:39.494282 4555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbd7dd5281f9d575736eacd577f826add0efd6e39abe6e8f4740fc58244951d4\": container with ID starting with bbd7dd5281f9d575736eacd577f826add0efd6e39abe6e8f4740fc58244951d4 not found: ID does not exist" containerID="bbd7dd5281f9d575736eacd577f826add0efd6e39abe6e8f4740fc58244951d4" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.494330 4555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbd7dd5281f9d575736eacd577f826add0efd6e39abe6e8f4740fc58244951d4"} err="failed to get container status \"bbd7dd5281f9d575736eacd577f826add0efd6e39abe6e8f4740fc58244951d4\": rpc error: code = NotFound desc = could not find container \"bbd7dd5281f9d575736eacd577f826add0efd6e39abe6e8f4740fc58244951d4\": container with ID starting with bbd7dd5281f9d575736eacd577f826add0efd6e39abe6e8f4740fc58244951d4 not found: ID does not exist" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.494355 4555 scope.go:117] "RemoveContainer" containerID="cf218e03cb5d7f91d68b277fb9b05d0ac2942940e11bbd8bc9073aa7c0ba1b29" Dec 09 15:02:39 crc kubenswrapper[4555]: E1209 15:02:39.494755 4555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf218e03cb5d7f91d68b277fb9b05d0ac2942940e11bbd8bc9073aa7c0ba1b29\": container with ID starting with cf218e03cb5d7f91d68b277fb9b05d0ac2942940e11bbd8bc9073aa7c0ba1b29 not found: ID does not exist" containerID="cf218e03cb5d7f91d68b277fb9b05d0ac2942940e11bbd8bc9073aa7c0ba1b29" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.494782 4555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf218e03cb5d7f91d68b277fb9b05d0ac2942940e11bbd8bc9073aa7c0ba1b29"} err="failed to get container status \"cf218e03cb5d7f91d68b277fb9b05d0ac2942940e11bbd8bc9073aa7c0ba1b29\": rpc error: code = NotFound desc = could not find container \"cf218e03cb5d7f91d68b277fb9b05d0ac2942940e11bbd8bc9073aa7c0ba1b29\": container with ID starting with 
cf218e03cb5d7f91d68b277fb9b05d0ac2942940e11bbd8bc9073aa7c0ba1b29 not found: ID does not exist" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.494796 4555 scope.go:117] "RemoveContainer" containerID="15719dafad2ef5fef86c9a7f2f4402239d8a2237de04f545d7663b71c380cc85" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.509314 4555 scope.go:117] "RemoveContainer" containerID="88b8aa6eaa59af9e33fdc13805932789b6f19cb1de086b3f84c70abf6859e5d9" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.521679 4555 scope.go:117] "RemoveContainer" containerID="5117c692e72c56ee79972900b8daff62f2eec8c0ce9cb60398e10ca0c4581d24" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.531326 4555 scope.go:117] "RemoveContainer" containerID="15719dafad2ef5fef86c9a7f2f4402239d8a2237de04f545d7663b71c380cc85" Dec 09 15:02:39 crc kubenswrapper[4555]: E1209 15:02:39.531594 4555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15719dafad2ef5fef86c9a7f2f4402239d8a2237de04f545d7663b71c380cc85\": container with ID starting with 15719dafad2ef5fef86c9a7f2f4402239d8a2237de04f545d7663b71c380cc85 not found: ID does not exist" containerID="15719dafad2ef5fef86c9a7f2f4402239d8a2237de04f545d7663b71c380cc85" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.531621 4555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15719dafad2ef5fef86c9a7f2f4402239d8a2237de04f545d7663b71c380cc85"} err="failed to get container status \"15719dafad2ef5fef86c9a7f2f4402239d8a2237de04f545d7663b71c380cc85\": rpc error: code = NotFound desc = could not find container \"15719dafad2ef5fef86c9a7f2f4402239d8a2237de04f545d7663b71c380cc85\": container with ID starting with 15719dafad2ef5fef86c9a7f2f4402239d8a2237de04f545d7663b71c380cc85 not found: ID does not exist" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.531643 4555 scope.go:117] "RemoveContainer" containerID="88b8aa6eaa59af9e33fdc13805932789b6f19cb1de086b3f84c70abf6859e5d9" Dec 09 15:02:39 crc kubenswrapper[4555]: E1209 15:02:39.531848 4555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88b8aa6eaa59af9e33fdc13805932789b6f19cb1de086b3f84c70abf6859e5d9\": container with ID starting with 88b8aa6eaa59af9e33fdc13805932789b6f19cb1de086b3f84c70abf6859e5d9 not found: ID does not exist" containerID="88b8aa6eaa59af9e33fdc13805932789b6f19cb1de086b3f84c70abf6859e5d9" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.531877 4555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88b8aa6eaa59af9e33fdc13805932789b6f19cb1de086b3f84c70abf6859e5d9"} err="failed to get container status \"88b8aa6eaa59af9e33fdc13805932789b6f19cb1de086b3f84c70abf6859e5d9\": rpc error: code = NotFound desc = could not find container \"88b8aa6eaa59af9e33fdc13805932789b6f19cb1de086b3f84c70abf6859e5d9\": container with ID starting with 88b8aa6eaa59af9e33fdc13805932789b6f19cb1de086b3f84c70abf6859e5d9 not found: ID does not exist" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.531892 4555 scope.go:117] "RemoveContainer" containerID="5117c692e72c56ee79972900b8daff62f2eec8c0ce9cb60398e10ca0c4581d24" Dec 09 15:02:39 crc kubenswrapper[4555]: E1209 15:02:39.532136 4555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5117c692e72c56ee79972900b8daff62f2eec8c0ce9cb60398e10ca0c4581d24\": container 
with ID starting with 5117c692e72c56ee79972900b8daff62f2eec8c0ce9cb60398e10ca0c4581d24 not found: ID does not exist" containerID="5117c692e72c56ee79972900b8daff62f2eec8c0ce9cb60398e10ca0c4581d24" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.532151 4555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5117c692e72c56ee79972900b8daff62f2eec8c0ce9cb60398e10ca0c4581d24"} err="failed to get container status \"5117c692e72c56ee79972900b8daff62f2eec8c0ce9cb60398e10ca0c4581d24\": rpc error: code = NotFound desc = could not find container \"5117c692e72c56ee79972900b8daff62f2eec8c0ce9cb60398e10ca0c4581d24\": container with ID starting with 5117c692e72c56ee79972900b8daff62f2eec8c0ce9cb60398e10ca0c4581d24 not found: ID does not exist" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.532163 4555 scope.go:117] "RemoveContainer" containerID="4ec27fbe6f3735de959aa11530f42a6fce5037a4db3e8eaa07ccb4da6b1dd517" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.544330 4555 scope.go:117] "RemoveContainer" containerID="179c6fd2fbf50210d96266ee240dfacdb0a2b5a8a35f6d7f9336a6e50cbe18c8" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.548861 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.555288 4555 scope.go:117] "RemoveContainer" containerID="1eef4e539ca6e125894f0c0150478b564bac0d394786f6aa00d0b1ef2d5ae2b1" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.565588 4555 scope.go:117] "RemoveContainer" containerID="4ec27fbe6f3735de959aa11530f42a6fce5037a4db3e8eaa07ccb4da6b1dd517" Dec 09 15:02:39 crc kubenswrapper[4555]: E1209 15:02:39.565943 4555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ec27fbe6f3735de959aa11530f42a6fce5037a4db3e8eaa07ccb4da6b1dd517\": container with ID starting with 4ec27fbe6f3735de959aa11530f42a6fce5037a4db3e8eaa07ccb4da6b1dd517 not found: ID does not exist" containerID="4ec27fbe6f3735de959aa11530f42a6fce5037a4db3e8eaa07ccb4da6b1dd517" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.565985 4555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ec27fbe6f3735de959aa11530f42a6fce5037a4db3e8eaa07ccb4da6b1dd517"} err="failed to get container status \"4ec27fbe6f3735de959aa11530f42a6fce5037a4db3e8eaa07ccb4da6b1dd517\": rpc error: code = NotFound desc = could not find container \"4ec27fbe6f3735de959aa11530f42a6fce5037a4db3e8eaa07ccb4da6b1dd517\": container with ID starting with 4ec27fbe6f3735de959aa11530f42a6fce5037a4db3e8eaa07ccb4da6b1dd517 not found: ID does not exist" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.566008 4555 scope.go:117] "RemoveContainer" containerID="179c6fd2fbf50210d96266ee240dfacdb0a2b5a8a35f6d7f9336a6e50cbe18c8" Dec 09 15:02:39 crc kubenswrapper[4555]: E1209 15:02:39.566331 4555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"179c6fd2fbf50210d96266ee240dfacdb0a2b5a8a35f6d7f9336a6e50cbe18c8\": container with ID starting with 179c6fd2fbf50210d96266ee240dfacdb0a2b5a8a35f6d7f9336a6e50cbe18c8 not found: ID does not exist" containerID="179c6fd2fbf50210d96266ee240dfacdb0a2b5a8a35f6d7f9336a6e50cbe18c8" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.566353 4555 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"179c6fd2fbf50210d96266ee240dfacdb0a2b5a8a35f6d7f9336a6e50cbe18c8"} err="failed to get container status \"179c6fd2fbf50210d96266ee240dfacdb0a2b5a8a35f6d7f9336a6e50cbe18c8\": rpc error: code = NotFound desc = could not find container \"179c6fd2fbf50210d96266ee240dfacdb0a2b5a8a35f6d7f9336a6e50cbe18c8\": container with ID starting with 179c6fd2fbf50210d96266ee240dfacdb0a2b5a8a35f6d7f9336a6e50cbe18c8 not found: ID does not exist" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.566367 4555 scope.go:117] "RemoveContainer" containerID="1eef4e539ca6e125894f0c0150478b564bac0d394786f6aa00d0b1ef2d5ae2b1" Dec 09 15:02:39 crc kubenswrapper[4555]: E1209 15:02:39.566640 4555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1eef4e539ca6e125894f0c0150478b564bac0d394786f6aa00d0b1ef2d5ae2b1\": container with ID starting with 1eef4e539ca6e125894f0c0150478b564bac0d394786f6aa00d0b1ef2d5ae2b1 not found: ID does not exist" containerID="1eef4e539ca6e125894f0c0150478b564bac0d394786f6aa00d0b1ef2d5ae2b1" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.566662 4555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1eef4e539ca6e125894f0c0150478b564bac0d394786f6aa00d0b1ef2d5ae2b1"} err="failed to get container status \"1eef4e539ca6e125894f0c0150478b564bac0d394786f6aa00d0b1ef2d5ae2b1\": rpc error: code = NotFound desc = could not find container \"1eef4e539ca6e125894f0c0150478b564bac0d394786f6aa00d0b1ef2d5ae2b1\": container with ID starting with 1eef4e539ca6e125894f0c0150478b564bac0d394786f6aa00d0b1ef2d5ae2b1 not found: ID does not exist" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.566678 4555 scope.go:117] "RemoveContainer" containerID="703a32908d95de30c0a23d8b5ccf34acd82aa5d32fd58b62abf083a63d3aec9f" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.575427 4555 scope.go:117] "RemoveContainer" containerID="703a32908d95de30c0a23d8b5ccf34acd82aa5d32fd58b62abf083a63d3aec9f" Dec 09 15:02:39 crc kubenswrapper[4555]: E1209 15:02:39.575691 4555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"703a32908d95de30c0a23d8b5ccf34acd82aa5d32fd58b62abf083a63d3aec9f\": container with ID starting with 703a32908d95de30c0a23d8b5ccf34acd82aa5d32fd58b62abf083a63d3aec9f not found: ID does not exist" containerID="703a32908d95de30c0a23d8b5ccf34acd82aa5d32fd58b62abf083a63d3aec9f" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.575722 4555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"703a32908d95de30c0a23d8b5ccf34acd82aa5d32fd58b62abf083a63d3aec9f"} err="failed to get container status \"703a32908d95de30c0a23d8b5ccf34acd82aa5d32fd58b62abf083a63d3aec9f\": rpc error: code = NotFound desc = could not find container \"703a32908d95de30c0a23d8b5ccf34acd82aa5d32fd58b62abf083a63d3aec9f\": container with ID starting with 703a32908d95de30c0a23d8b5ccf34acd82aa5d32fd58b62abf083a63d3aec9f not found: ID does not exist" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.651323 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.878732 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 09 15:02:39 crc kubenswrapper[4555]: I1209 15:02:39.966477 4555 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 09 15:02:40 crc kubenswrapper[4555]: I1209 15:02:40.032946 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 09 15:02:40 crc kubenswrapper[4555]: I1209 15:02:40.142070 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 09 15:02:40 crc kubenswrapper[4555]: I1209 15:02:40.167140 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14dca34a-64a8-4190-aa9a-07c5cb2bdf33" path="/var/lib/kubelet/pods/14dca34a-64a8-4190-aa9a-07c5cb2bdf33/volumes" Dec 09 15:02:40 crc kubenswrapper[4555]: I1209 15:02:40.167702 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b3ae0d5-62f7-4219-b006-f6a33d913018" path="/var/lib/kubelet/pods/5b3ae0d5-62f7-4219-b006-f6a33d913018/volumes" Dec 09 15:02:40 crc kubenswrapper[4555]: I1209 15:02:40.168270 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c1c0513-5d12-463c-b66d-c1a5d02eafc0" path="/var/lib/kubelet/pods/8c1c0513-5d12-463c-b66d-c1a5d02eafc0/volumes" Dec 09 15:02:40 crc kubenswrapper[4555]: I1209 15:02:40.168686 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af79ef0a-fda8-4009-b915-297808ea4398" path="/var/lib/kubelet/pods/af79ef0a-fda8-4009-b915-297808ea4398/volumes" Dec 09 15:02:40 crc kubenswrapper[4555]: I1209 15:02:40.169207 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd5cca88-9272-46b2-89f1-b0c2b44e2a9f" path="/var/lib/kubelet/pods/fd5cca88-9272-46b2-89f1-b0c2b44e2a9f/volumes" Dec 09 15:02:40 crc kubenswrapper[4555]: I1209 15:02:40.175208 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 09 15:02:40 crc kubenswrapper[4555]: I1209 15:02:40.289832 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 09 15:02:40 crc kubenswrapper[4555]: I1209 15:02:40.386739 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-pqkv4" event={"ID":"9ef9ab0f-e5f9-40b4-b761-666bd4e1832d","Type":"ContainerStarted","Data":"6b24651d1e12556bd403ab19e82cfd2892a1b758890d5e0700bc55aeaecd9d31"} Dec 09 15:02:40 crc kubenswrapper[4555]: I1209 15:02:40.387485 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-pqkv4" Dec 09 15:02:40 crc kubenswrapper[4555]: I1209 15:02:40.393569 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-pqkv4" Dec 09 15:02:40 crc kubenswrapper[4555]: I1209 15:02:40.397960 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 09 15:02:40 crc kubenswrapper[4555]: I1209 15:02:40.404523 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-pqkv4" podStartSLOduration=2.404509898 podStartE2EDuration="2.404509898s" podCreationTimestamp="2025-12-09 15:02:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:02:40.401208154 +0000 
UTC m=+270.537920872" watchObservedRunningTime="2025-12-09 15:02:40.404509898 +0000 UTC m=+270.541222616" Dec 09 15:02:40 crc kubenswrapper[4555]: I1209 15:02:40.693099 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 09 15:02:40 crc kubenswrapper[4555]: I1209 15:02:40.760283 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 09 15:02:40 crc kubenswrapper[4555]: I1209 15:02:40.850829 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 09 15:02:40 crc kubenswrapper[4555]: I1209 15:02:40.959871 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 09 15:02:41 crc kubenswrapper[4555]: I1209 15:02:41.074466 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 09 15:02:41 crc kubenswrapper[4555]: I1209 15:02:41.085177 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 09 15:02:41 crc kubenswrapper[4555]: I1209 15:02:41.135036 4555 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 09 15:02:41 crc kubenswrapper[4555]: I1209 15:02:41.179075 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 09 15:02:41 crc kubenswrapper[4555]: I1209 15:02:41.331731 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 09 15:02:41 crc kubenswrapper[4555]: I1209 15:02:41.353379 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 09 15:02:41 crc kubenswrapper[4555]: I1209 15:02:41.381964 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 09 15:02:41 crc kubenswrapper[4555]: I1209 15:02:41.566597 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 09 15:02:41 crc kubenswrapper[4555]: I1209 15:02:41.592894 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 09 15:02:41 crc kubenswrapper[4555]: I1209 15:02:41.617681 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 09 15:02:41 crc kubenswrapper[4555]: I1209 15:02:41.659061 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 09 15:02:41 crc kubenswrapper[4555]: I1209 15:02:41.686330 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 09 15:02:41 crc kubenswrapper[4555]: I1209 15:02:41.863649 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 09 15:02:42 crc kubenswrapper[4555]: I1209 15:02:42.336376 4555 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 09 15:02:42 crc kubenswrapper[4555]: I1209 15:02:42.336439 4555 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 15:02:42 crc kubenswrapper[4555]: I1209 15:02:42.385281 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 09 15:02:42 crc kubenswrapper[4555]: I1209 15:02:42.388856 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 09 15:02:42 crc kubenswrapper[4555]: I1209 15:02:42.398171 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 09 15:02:42 crc kubenswrapper[4555]: I1209 15:02:42.398210 4555 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="a9f634a95c76b3a1a725d945590ee6d5166f8b1711fe56be36eb555c46b354bd" exitCode=137 Dec 09 15:02:42 crc kubenswrapper[4555]: I1209 15:02:42.398478 4555 scope.go:117] "RemoveContainer" containerID="a9f634a95c76b3a1a725d945590ee6d5166f8b1711fe56be36eb555c46b354bd" Dec 09 15:02:42 crc kubenswrapper[4555]: I1209 15:02:42.398601 4555 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 15:02:42 crc kubenswrapper[4555]: I1209 15:02:42.416163 4555 scope.go:117] "RemoveContainer" containerID="a9f634a95c76b3a1a725d945590ee6d5166f8b1711fe56be36eb555c46b354bd" Dec 09 15:02:42 crc kubenswrapper[4555]: E1209 15:02:42.416566 4555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9f634a95c76b3a1a725d945590ee6d5166f8b1711fe56be36eb555c46b354bd\": container with ID starting with a9f634a95c76b3a1a725d945590ee6d5166f8b1711fe56be36eb555c46b354bd not found: ID does not exist" containerID="a9f634a95c76b3a1a725d945590ee6d5166f8b1711fe56be36eb555c46b354bd" Dec 09 15:02:42 crc kubenswrapper[4555]: I1209 15:02:42.416593 4555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9f634a95c76b3a1a725d945590ee6d5166f8b1711fe56be36eb555c46b354bd"} err="failed to get container status \"a9f634a95c76b3a1a725d945590ee6d5166f8b1711fe56be36eb555c46b354bd\": rpc error: code = NotFound desc = could not find container \"a9f634a95c76b3a1a725d945590ee6d5166f8b1711fe56be36eb555c46b354bd\": container with ID starting with a9f634a95c76b3a1a725d945590ee6d5166f8b1711fe56be36eb555c46b354bd not found: ID does not exist" Dec 09 15:02:42 crc kubenswrapper[4555]: I1209 15:02:42.478281 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 09 15:02:42 crc kubenswrapper[4555]: I1209 15:02:42.478610 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") 
" Dec 09 15:02:42 crc kubenswrapper[4555]: I1209 15:02:42.478643 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 09 15:02:42 crc kubenswrapper[4555]: I1209 15:02:42.478670 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 09 15:02:42 crc kubenswrapper[4555]: I1209 15:02:42.478717 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 15:02:42 crc kubenswrapper[4555]: I1209 15:02:42.478734 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 09 15:02:42 crc kubenswrapper[4555]: I1209 15:02:42.478759 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 15:02:42 crc kubenswrapper[4555]: I1209 15:02:42.478787 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 15:02:42 crc kubenswrapper[4555]: I1209 15:02:42.478855 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 15:02:42 crc kubenswrapper[4555]: I1209 15:02:42.479040 4555 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Dec 09 15:02:42 crc kubenswrapper[4555]: I1209 15:02:42.479056 4555 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Dec 09 15:02:42 crc kubenswrapper[4555]: I1209 15:02:42.479066 4555 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 09 15:02:42 crc kubenswrapper[4555]: I1209 15:02:42.479074 4555 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Dec 09 15:02:42 crc kubenswrapper[4555]: I1209 15:02:42.484871 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 15:02:42 crc kubenswrapper[4555]: I1209 15:02:42.580072 4555 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 09 15:02:42 crc kubenswrapper[4555]: I1209 15:02:42.678499 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 09 15:02:42 crc kubenswrapper[4555]: I1209 15:02:42.998073 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 09 15:02:44 crc kubenswrapper[4555]: I1209 15:02:44.167606 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Dec 09 15:02:44 crc kubenswrapper[4555]: I1209 15:02:44.168407 4555 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Dec 09 15:02:44 crc kubenswrapper[4555]: I1209 15:02:44.177243 4555 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 09 15:02:44 crc kubenswrapper[4555]: I1209 15:02:44.177283 4555 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="46ad94d9-9301-4c96-9444-03bbd18f496d" Dec 09 15:02:44 crc kubenswrapper[4555]: I1209 15:02:44.180122 4555 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 09 15:02:44 crc kubenswrapper[4555]: I1209 15:02:44.180172 4555 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="46ad94d9-9301-4c96-9444-03bbd18f496d" Dec 09 15:02:45 crc kubenswrapper[4555]: I1209 15:02:45.318282 4555 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 09 15:03:07 crc kubenswrapper[4555]: I1209 15:03:07.009600 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6d5b84845-9ds7z"] Dec 09 15:03:07 crc kubenswrapper[4555]: E1209 15:03:07.010835 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14dca34a-64a8-4190-aa9a-07c5cb2bdf33" containerName="registry-server" Dec 09 15:03:07 crc kubenswrapper[4555]: I1209 15:03:07.010913 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="14dca34a-64a8-4190-aa9a-07c5cb2bdf33" containerName="registry-server" Dec 09 15:03:07 crc kubenswrapper[4555]: E1209 15:03:07.010976 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b3ae0d5-62f7-4219-b006-f6a33d913018" containerName="registry-server" Dec 09 15:03:07 crc kubenswrapper[4555]: I1209 15:03:07.011030 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b3ae0d5-62f7-4219-b006-f6a33d913018" containerName="registry-server" Dec 09 15:03:07 crc kubenswrapper[4555]: E1209 15:03:07.011098 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd5cca88-9272-46b2-89f1-b0c2b44e2a9f" containerName="extract-utilities" Dec 09 15:03:07 crc kubenswrapper[4555]: I1209 15:03:07.011157 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd5cca88-9272-46b2-89f1-b0c2b44e2a9f" containerName="extract-utilities" Dec 09 15:03:07 crc kubenswrapper[4555]: E1209 15:03:07.011218 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b3ae0d5-62f7-4219-b006-f6a33d913018" containerName="extract-content" Dec 09 15:03:07 crc kubenswrapper[4555]: I1209 15:03:07.011286 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b3ae0d5-62f7-4219-b006-f6a33d913018" containerName="extract-content" Dec 09 15:03:07 crc kubenswrapper[4555]: E1209 15:03:07.011373 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af79ef0a-fda8-4009-b915-297808ea4398" containerName="extract-utilities" Dec 09 15:03:07 crc kubenswrapper[4555]: I1209 15:03:07.011425 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="af79ef0a-fda8-4009-b915-297808ea4398" containerName="extract-utilities" Dec 09 15:03:07 crc kubenswrapper[4555]: E1209 15:03:07.011476 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd5cca88-9272-46b2-89f1-b0c2b44e2a9f" containerName="extract-content" Dec 09 15:03:07 crc kubenswrapper[4555]: I1209 15:03:07.011520 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd5cca88-9272-46b2-89f1-b0c2b44e2a9f" containerName="extract-content" Dec 09 15:03:07 crc kubenswrapper[4555]: E1209 15:03:07.011570 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af79ef0a-fda8-4009-b915-297808ea4398" containerName="extract-content" Dec 09 15:03:07 crc kubenswrapper[4555]: I1209 15:03:07.011622 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="af79ef0a-fda8-4009-b915-297808ea4398" containerName="extract-content" Dec 09 15:03:07 crc kubenswrapper[4555]: E1209 15:03:07.011671 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c1c0513-5d12-463c-b66d-c1a5d02eafc0" containerName="marketplace-operator" Dec 09 15:03:07 crc kubenswrapper[4555]: I1209 15:03:07.011718 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c1c0513-5d12-463c-b66d-c1a5d02eafc0" containerName="marketplace-operator" Dec 09 15:03:07 crc kubenswrapper[4555]: E1209 15:03:07.011770 4555 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="fd5cca88-9272-46b2-89f1-b0c2b44e2a9f" containerName="registry-server" Dec 09 15:03:07 crc kubenswrapper[4555]: I1209 15:03:07.011819 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd5cca88-9272-46b2-89f1-b0c2b44e2a9f" containerName="registry-server" Dec 09 15:03:07 crc kubenswrapper[4555]: E1209 15:03:07.011867 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14dca34a-64a8-4190-aa9a-07c5cb2bdf33" containerName="extract-utilities" Dec 09 15:03:07 crc kubenswrapper[4555]: I1209 15:03:07.011914 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="14dca34a-64a8-4190-aa9a-07c5cb2bdf33" containerName="extract-utilities" Dec 09 15:03:07 crc kubenswrapper[4555]: E1209 15:03:07.011971 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af79ef0a-fda8-4009-b915-297808ea4398" containerName="registry-server" Dec 09 15:03:07 crc kubenswrapper[4555]: I1209 15:03:07.012018 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="af79ef0a-fda8-4009-b915-297808ea4398" containerName="registry-server" Dec 09 15:03:07 crc kubenswrapper[4555]: E1209 15:03:07.012068 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b3ae0d5-62f7-4219-b006-f6a33d913018" containerName="extract-utilities" Dec 09 15:03:07 crc kubenswrapper[4555]: I1209 15:03:07.012119 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b3ae0d5-62f7-4219-b006-f6a33d913018" containerName="extract-utilities" Dec 09 15:03:07 crc kubenswrapper[4555]: E1209 15:03:07.012170 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14dca34a-64a8-4190-aa9a-07c5cb2bdf33" containerName="extract-content" Dec 09 15:03:07 crc kubenswrapper[4555]: I1209 15:03:07.012221 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="14dca34a-64a8-4190-aa9a-07c5cb2bdf33" containerName="extract-content" Dec 09 15:03:07 crc kubenswrapper[4555]: I1209 15:03:07.012372 4555 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b3ae0d5-62f7-4219-b006-f6a33d913018" containerName="registry-server" Dec 09 15:03:07 crc kubenswrapper[4555]: I1209 15:03:07.012458 4555 memory_manager.go:354] "RemoveStaleState removing state" podUID="14dca34a-64a8-4190-aa9a-07c5cb2bdf33" containerName="registry-server" Dec 09 15:03:07 crc kubenswrapper[4555]: I1209 15:03:07.012513 4555 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd5cca88-9272-46b2-89f1-b0c2b44e2a9f" containerName="registry-server" Dec 09 15:03:07 crc kubenswrapper[4555]: I1209 15:03:07.012560 4555 memory_manager.go:354] "RemoveStaleState removing state" podUID="af79ef0a-fda8-4009-b915-297808ea4398" containerName="registry-server" Dec 09 15:03:07 crc kubenswrapper[4555]: I1209 15:03:07.012610 4555 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c1c0513-5d12-463c-b66d-c1a5d02eafc0" containerName="marketplace-operator" Dec 09 15:03:07 crc kubenswrapper[4555]: I1209 15:03:07.012975 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-9ds7z" Dec 09 15:03:07 crc kubenswrapper[4555]: I1209 15:03:07.014737 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt" Dec 09 15:03:07 crc kubenswrapper[4555]: I1209 15:03:07.016170 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt" Dec 09 15:03:07 crc kubenswrapper[4555]: I1209 15:03:07.016280 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-dockercfg-wwt9l" Dec 09 15:03:07 crc kubenswrapper[4555]: I1209 15:03:07.016822 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config" Dec 09 15:03:07 crc kubenswrapper[4555]: I1209 15:03:07.017940 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls" Dec 09 15:03:07 crc kubenswrapper[4555]: I1209 15:03:07.020856 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6d5b84845-9ds7z"] Dec 09 15:03:07 crc kubenswrapper[4555]: I1209 15:03:07.039733 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8bac5977-379a-4f6c-920a-3a66e1d5a1d1-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6d5b84845-9ds7z\" (UID: \"8bac5977-379a-4f6c-920a-3a66e1d5a1d1\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-9ds7z" Dec 09 15:03:07 crc kubenswrapper[4555]: I1209 15:03:07.039833 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/8bac5977-379a-4f6c-920a-3a66e1d5a1d1-telemetry-config\") pod \"cluster-monitoring-operator-6d5b84845-9ds7z\" (UID: \"8bac5977-379a-4f6c-920a-3a66e1d5a1d1\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-9ds7z" Dec 09 15:03:07 crc kubenswrapper[4555]: I1209 15:03:07.039895 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v58xl\" (UniqueName: \"kubernetes.io/projected/8bac5977-379a-4f6c-920a-3a66e1d5a1d1-kube-api-access-v58xl\") pod \"cluster-monitoring-operator-6d5b84845-9ds7z\" (UID: \"8bac5977-379a-4f6c-920a-3a66e1d5a1d1\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-9ds7z" Dec 09 15:03:07 crc kubenswrapper[4555]: I1209 15:03:07.140594 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/8bac5977-379a-4f6c-920a-3a66e1d5a1d1-telemetry-config\") pod \"cluster-monitoring-operator-6d5b84845-9ds7z\" (UID: \"8bac5977-379a-4f6c-920a-3a66e1d5a1d1\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-9ds7z" Dec 09 15:03:07 crc kubenswrapper[4555]: I1209 15:03:07.140644 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v58xl\" (UniqueName: \"kubernetes.io/projected/8bac5977-379a-4f6c-920a-3a66e1d5a1d1-kube-api-access-v58xl\") pod \"cluster-monitoring-operator-6d5b84845-9ds7z\" (UID: \"8bac5977-379a-4f6c-920a-3a66e1d5a1d1\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-9ds7z" Dec 09 15:03:07 crc kubenswrapper[4555]: I1209 15:03:07.140700 4555 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8bac5977-379a-4f6c-920a-3a66e1d5a1d1-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6d5b84845-9ds7z\" (UID: \"8bac5977-379a-4f6c-920a-3a66e1d5a1d1\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-9ds7z" Dec 09 15:03:07 crc kubenswrapper[4555]: I1209 15:03:07.141470 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/8bac5977-379a-4f6c-920a-3a66e1d5a1d1-telemetry-config\") pod \"cluster-monitoring-operator-6d5b84845-9ds7z\" (UID: \"8bac5977-379a-4f6c-920a-3a66e1d5a1d1\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-9ds7z" Dec 09 15:03:07 crc kubenswrapper[4555]: I1209 15:03:07.145032 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8bac5977-379a-4f6c-920a-3a66e1d5a1d1-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6d5b84845-9ds7z\" (UID: \"8bac5977-379a-4f6c-920a-3a66e1d5a1d1\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-9ds7z" Dec 09 15:03:07 crc kubenswrapper[4555]: I1209 15:03:07.152687 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v58xl\" (UniqueName: \"kubernetes.io/projected/8bac5977-379a-4f6c-920a-3a66e1d5a1d1-kube-api-access-v58xl\") pod \"cluster-monitoring-operator-6d5b84845-9ds7z\" (UID: \"8bac5977-379a-4f6c-920a-3a66e1d5a1d1\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-9ds7z" Dec 09 15:03:07 crc kubenswrapper[4555]: I1209 15:03:07.327143 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-9ds7z" Dec 09 15:03:07 crc kubenswrapper[4555]: I1209 15:03:07.660645 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6d5b84845-9ds7z"] Dec 09 15:03:08 crc kubenswrapper[4555]: I1209 15:03:08.509576 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-9ds7z" event={"ID":"8bac5977-379a-4f6c-920a-3a66e1d5a1d1","Type":"ContainerStarted","Data":"ef48434ddfbc34e0d223d9c3b9c0e2dbaaacfad72128e11124a49b68464d4ca8"} Dec 09 15:03:09 crc kubenswrapper[4555]: I1209 15:03:09.514573 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-9ds7z" event={"ID":"8bac5977-379a-4f6c-920a-3a66e1d5a1d1","Type":"ContainerStarted","Data":"406b38e6081c32980ef9fb21a34b903fda89b340e03c254b2476630b6db80836"} Dec 09 15:03:09 crc kubenswrapper[4555]: I1209 15:03:09.527193 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-9ds7z" podStartSLOduration=2.016332857 podStartE2EDuration="3.527180573s" podCreationTimestamp="2025-12-09 15:03:06 +0000 UTC" firstStartedPulling="2025-12-09 15:03:07.6655496 +0000 UTC m=+297.802262319" lastFinishedPulling="2025-12-09 15:03:09.176397317 +0000 UTC m=+299.313110035" observedRunningTime="2025-12-09 15:03:09.524659256 +0000 UTC m=+299.661371974" watchObservedRunningTime="2025-12-09 15:03:09.527180573 +0000 UTC m=+299.663893291" Dec 09 15:03:09 crc kubenswrapper[4555]: I1209 15:03:09.598593 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-tnf48"] Dec 09 15:03:09 crc kubenswrapper[4555]: I1209 15:03:09.599680 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-tnf48" Dec 09 15:03:09 crc kubenswrapper[4555]: I1209 15:03:09.603967 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-tnf48"] Dec 09 15:03:09 crc kubenswrapper[4555]: I1209 15:03:09.604099 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls" Dec 09 15:03:09 crc kubenswrapper[4555]: I1209 15:03:09.669264 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/858a1ce9-f128-41e2-b6e0-3fabde0c8d33-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-tnf48\" (UID: \"858a1ce9-f128-41e2-b6e0-3fabde0c8d33\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-tnf48" Dec 09 15:03:09 crc kubenswrapper[4555]: I1209 15:03:09.770158 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/858a1ce9-f128-41e2-b6e0-3fabde0c8d33-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-tnf48\" (UID: \"858a1ce9-f128-41e2-b6e0-3fabde0c8d33\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-tnf48" Dec 09 15:03:09 crc kubenswrapper[4555]: E1209 15:03:09.770334 4555 secret.go:188] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found Dec 09 15:03:09 crc kubenswrapper[4555]: E1209 15:03:09.770386 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/858a1ce9-f128-41e2-b6e0-3fabde0c8d33-tls-certificates podName:858a1ce9-f128-41e2-b6e0-3fabde0c8d33 nodeName:}" failed. No retries permitted until 2025-12-09 15:03:10.270371223 +0000 UTC m=+300.407083941 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/858a1ce9-f128-41e2-b6e0-3fabde0c8d33-tls-certificates") pod "prometheus-operator-admission-webhook-f54c54754-tnf48" (UID: "858a1ce9-f128-41e2-b6e0-3fabde0c8d33") : secret "prometheus-operator-admission-webhook-tls" not found Dec 09 15:03:09 crc kubenswrapper[4555]: I1209 15:03:09.893288 4555 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-rf8rn"] Dec 09 15:03:09 crc kubenswrapper[4555]: I1209 15:03:09.893499 4555 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-rf8rn" podUID="dae2070b-58ae-47b7-8609-3ca457ed809a" containerName="controller-manager" containerID="cri-o://28da5b30abf7efc9e301eca9bc2d4566e6101847965e922da990de0153d47328" gracePeriod=30 Dec 09 15:03:10 crc kubenswrapper[4555]: I1209 15:03:10.028597 4555 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qdpr6"] Dec 09 15:03:10 crc kubenswrapper[4555]: I1209 15:03:10.028757 4555 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qdpr6" podUID="d0f5603f-8e68-4cfb-9aa1-a6854b684172" containerName="route-controller-manager" containerID="cri-o://05fee7b2bfa9af8f8109fe4522cf82cee4fa3aea171cbb4d623eb4ea0d81f4fa" gracePeriod=30 Dec 09 15:03:10 crc kubenswrapper[4555]: I1209 15:03:10.056007 4555 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Dec 09 15:03:10 crc kubenswrapper[4555]: I1209 15:03:10.179874 4555 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-rf8rn" Dec 09 15:03:10 crc kubenswrapper[4555]: I1209 15:03:10.276746 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dae2070b-58ae-47b7-8609-3ca457ed809a-serving-cert\") pod \"dae2070b-58ae-47b7-8609-3ca457ed809a\" (UID: \"dae2070b-58ae-47b7-8609-3ca457ed809a\") " Dec 09 15:03:10 crc kubenswrapper[4555]: I1209 15:03:10.276992 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-257hz\" (UniqueName: \"kubernetes.io/projected/dae2070b-58ae-47b7-8609-3ca457ed809a-kube-api-access-257hz\") pod \"dae2070b-58ae-47b7-8609-3ca457ed809a\" (UID: \"dae2070b-58ae-47b7-8609-3ca457ed809a\") " Dec 09 15:03:10 crc kubenswrapper[4555]: I1209 15:03:10.277029 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dae2070b-58ae-47b7-8609-3ca457ed809a-client-ca\") pod \"dae2070b-58ae-47b7-8609-3ca457ed809a\" (UID: \"dae2070b-58ae-47b7-8609-3ca457ed809a\") " Dec 09 15:03:10 crc kubenswrapper[4555]: I1209 15:03:10.277088 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dae2070b-58ae-47b7-8609-3ca457ed809a-config\") pod \"dae2070b-58ae-47b7-8609-3ca457ed809a\" (UID: \"dae2070b-58ae-47b7-8609-3ca457ed809a\") " Dec 09 15:03:10 crc kubenswrapper[4555]: I1209 15:03:10.277145 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dae2070b-58ae-47b7-8609-3ca457ed809a-proxy-ca-bundles\") pod \"dae2070b-58ae-47b7-8609-3ca457ed809a\" (UID: \"dae2070b-58ae-47b7-8609-3ca457ed809a\") " Dec 09 15:03:10 crc kubenswrapper[4555]: I1209 15:03:10.277383 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/858a1ce9-f128-41e2-b6e0-3fabde0c8d33-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-tnf48\" (UID: \"858a1ce9-f128-41e2-b6e0-3fabde0c8d33\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-tnf48" Dec 09 15:03:10 crc kubenswrapper[4555]: E1209 15:03:10.277479 4555 secret.go:188] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found Dec 09 15:03:10 crc kubenswrapper[4555]: E1209 15:03:10.277520 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/858a1ce9-f128-41e2-b6e0-3fabde0c8d33-tls-certificates podName:858a1ce9-f128-41e2-b6e0-3fabde0c8d33 nodeName:}" failed. No retries permitted until 2025-12-09 15:03:11.277506699 +0000 UTC m=+301.414219417 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/858a1ce9-f128-41e2-b6e0-3fabde0c8d33-tls-certificates") pod "prometheus-operator-admission-webhook-f54c54754-tnf48" (UID: "858a1ce9-f128-41e2-b6e0-3fabde0c8d33") : secret "prometheus-operator-admission-webhook-tls" not found Dec 09 15:03:10 crc kubenswrapper[4555]: I1209 15:03:10.277955 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dae2070b-58ae-47b7-8609-3ca457ed809a-client-ca" (OuterVolumeSpecName: "client-ca") pod "dae2070b-58ae-47b7-8609-3ca457ed809a" (UID: "dae2070b-58ae-47b7-8609-3ca457ed809a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 15:03:10 crc kubenswrapper[4555]: I1209 15:03:10.277975 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dae2070b-58ae-47b7-8609-3ca457ed809a-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "dae2070b-58ae-47b7-8609-3ca457ed809a" (UID: "dae2070b-58ae-47b7-8609-3ca457ed809a"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 15:03:10 crc kubenswrapper[4555]: I1209 15:03:10.278209 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dae2070b-58ae-47b7-8609-3ca457ed809a-config" (OuterVolumeSpecName: "config") pod "dae2070b-58ae-47b7-8609-3ca457ed809a" (UID: "dae2070b-58ae-47b7-8609-3ca457ed809a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 15:03:10 crc kubenswrapper[4555]: I1209 15:03:10.284049 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dae2070b-58ae-47b7-8609-3ca457ed809a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "dae2070b-58ae-47b7-8609-3ca457ed809a" (UID: "dae2070b-58ae-47b7-8609-3ca457ed809a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 15:03:10 crc kubenswrapper[4555]: I1209 15:03:10.284128 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dae2070b-58ae-47b7-8609-3ca457ed809a-kube-api-access-257hz" (OuterVolumeSpecName: "kube-api-access-257hz") pod "dae2070b-58ae-47b7-8609-3ca457ed809a" (UID: "dae2070b-58ae-47b7-8609-3ca457ed809a"). InnerVolumeSpecName "kube-api-access-257hz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:03:10 crc kubenswrapper[4555]: I1209 15:03:10.307616 4555 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qdpr6" Dec 09 15:03:10 crc kubenswrapper[4555]: I1209 15:03:10.378016 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0f5603f-8e68-4cfb-9aa1-a6854b684172-config\") pod \"d0f5603f-8e68-4cfb-9aa1-a6854b684172\" (UID: \"d0f5603f-8e68-4cfb-9aa1-a6854b684172\") " Dec 09 15:03:10 crc kubenswrapper[4555]: I1209 15:03:10.378063 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0f5603f-8e68-4cfb-9aa1-a6854b684172-serving-cert\") pod \"d0f5603f-8e68-4cfb-9aa1-a6854b684172\" (UID: \"d0f5603f-8e68-4cfb-9aa1-a6854b684172\") " Dec 09 15:03:10 crc kubenswrapper[4555]: I1209 15:03:10.378123 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6lr5\" (UniqueName: \"kubernetes.io/projected/d0f5603f-8e68-4cfb-9aa1-a6854b684172-kube-api-access-v6lr5\") pod \"d0f5603f-8e68-4cfb-9aa1-a6854b684172\" (UID: \"d0f5603f-8e68-4cfb-9aa1-a6854b684172\") " Dec 09 15:03:10 crc kubenswrapper[4555]: I1209 15:03:10.378151 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d0f5603f-8e68-4cfb-9aa1-a6854b684172-client-ca\") pod \"d0f5603f-8e68-4cfb-9aa1-a6854b684172\" (UID: \"d0f5603f-8e68-4cfb-9aa1-a6854b684172\") " Dec 09 15:03:10 crc kubenswrapper[4555]: I1209 15:03:10.378549 4555 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dae2070b-58ae-47b7-8609-3ca457ed809a-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 09 15:03:10 crc kubenswrapper[4555]: I1209 15:03:10.378573 4555 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dae2070b-58ae-47b7-8609-3ca457ed809a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 15:03:10 crc kubenswrapper[4555]: I1209 15:03:10.378583 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-257hz\" (UniqueName: \"kubernetes.io/projected/dae2070b-58ae-47b7-8609-3ca457ed809a-kube-api-access-257hz\") on node \"crc\" DevicePath \"\"" Dec 09 15:03:10 crc kubenswrapper[4555]: I1209 15:03:10.378596 4555 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dae2070b-58ae-47b7-8609-3ca457ed809a-client-ca\") on node \"crc\" DevicePath \"\"" Dec 09 15:03:10 crc kubenswrapper[4555]: I1209 15:03:10.378605 4555 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dae2070b-58ae-47b7-8609-3ca457ed809a-config\") on node \"crc\" DevicePath \"\"" Dec 09 15:03:10 crc kubenswrapper[4555]: I1209 15:03:10.378736 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0f5603f-8e68-4cfb-9aa1-a6854b684172-client-ca" (OuterVolumeSpecName: "client-ca") pod "d0f5603f-8e68-4cfb-9aa1-a6854b684172" (UID: "d0f5603f-8e68-4cfb-9aa1-a6854b684172"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 15:03:10 crc kubenswrapper[4555]: I1209 15:03:10.378748 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0f5603f-8e68-4cfb-9aa1-a6854b684172-config" (OuterVolumeSpecName: "config") pod "d0f5603f-8e68-4cfb-9aa1-a6854b684172" (UID: "d0f5603f-8e68-4cfb-9aa1-a6854b684172"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 15:03:10 crc kubenswrapper[4555]: I1209 15:03:10.380737 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0f5603f-8e68-4cfb-9aa1-a6854b684172-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d0f5603f-8e68-4cfb-9aa1-a6854b684172" (UID: "d0f5603f-8e68-4cfb-9aa1-a6854b684172"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 15:03:10 crc kubenswrapper[4555]: I1209 15:03:10.380803 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0f5603f-8e68-4cfb-9aa1-a6854b684172-kube-api-access-v6lr5" (OuterVolumeSpecName: "kube-api-access-v6lr5") pod "d0f5603f-8e68-4cfb-9aa1-a6854b684172" (UID: "d0f5603f-8e68-4cfb-9aa1-a6854b684172"). InnerVolumeSpecName "kube-api-access-v6lr5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:03:10 crc kubenswrapper[4555]: I1209 15:03:10.479774 4555 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0f5603f-8e68-4cfb-9aa1-a6854b684172-config\") on node \"crc\" DevicePath \"\"" Dec 09 15:03:10 crc kubenswrapper[4555]: I1209 15:03:10.479800 4555 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0f5603f-8e68-4cfb-9aa1-a6854b684172-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 15:03:10 crc kubenswrapper[4555]: I1209 15:03:10.479812 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6lr5\" (UniqueName: \"kubernetes.io/projected/d0f5603f-8e68-4cfb-9aa1-a6854b684172-kube-api-access-v6lr5\") on node \"crc\" DevicePath \"\"" Dec 09 15:03:10 crc kubenswrapper[4555]: I1209 15:03:10.479821 4555 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d0f5603f-8e68-4cfb-9aa1-a6854b684172-client-ca\") on node \"crc\" DevicePath \"\"" Dec 09 15:03:10 crc kubenswrapper[4555]: I1209 15:03:10.519212 4555 generic.go:334] "Generic (PLEG): container finished" podID="dae2070b-58ae-47b7-8609-3ca457ed809a" containerID="28da5b30abf7efc9e301eca9bc2d4566e6101847965e922da990de0153d47328" exitCode=0 Dec 09 15:03:10 crc kubenswrapper[4555]: I1209 15:03:10.519258 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-rf8rn" event={"ID":"dae2070b-58ae-47b7-8609-3ca457ed809a","Type":"ContainerDied","Data":"28da5b30abf7efc9e301eca9bc2d4566e6101847965e922da990de0153d47328"} Dec 09 15:03:10 crc kubenswrapper[4555]: I1209 15:03:10.519275 4555 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-rf8rn" Dec 09 15:03:10 crc kubenswrapper[4555]: I1209 15:03:10.519326 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-rf8rn" event={"ID":"dae2070b-58ae-47b7-8609-3ca457ed809a","Type":"ContainerDied","Data":"df8b2176d8c6d2cf4e427ada2bfdcf8a050a4d35e607d5589965653a17529715"} Dec 09 15:03:10 crc kubenswrapper[4555]: I1209 15:03:10.519355 4555 scope.go:117] "RemoveContainer" containerID="28da5b30abf7efc9e301eca9bc2d4566e6101847965e922da990de0153d47328" Dec 09 15:03:10 crc kubenswrapper[4555]: I1209 15:03:10.520672 4555 generic.go:334] "Generic (PLEG): container finished" podID="d0f5603f-8e68-4cfb-9aa1-a6854b684172" containerID="05fee7b2bfa9af8f8109fe4522cf82cee4fa3aea171cbb4d623eb4ea0d81f4fa" exitCode=0 Dec 09 15:03:10 crc kubenswrapper[4555]: I1209 15:03:10.520845 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qdpr6" event={"ID":"d0f5603f-8e68-4cfb-9aa1-a6854b684172","Type":"ContainerDied","Data":"05fee7b2bfa9af8f8109fe4522cf82cee4fa3aea171cbb4d623eb4ea0d81f4fa"} Dec 09 15:03:10 crc kubenswrapper[4555]: I1209 15:03:10.520873 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qdpr6" event={"ID":"d0f5603f-8e68-4cfb-9aa1-a6854b684172","Type":"ContainerDied","Data":"2e194e8279b55b50ed2f0451d6c3122ce427b731c639a95d8944879b378b1e91"} Dec 09 15:03:10 crc kubenswrapper[4555]: I1209 15:03:10.521220 4555 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qdpr6" Dec 09 15:03:10 crc kubenswrapper[4555]: I1209 15:03:10.530567 4555 scope.go:117] "RemoveContainer" containerID="28da5b30abf7efc9e301eca9bc2d4566e6101847965e922da990de0153d47328" Dec 09 15:03:10 crc kubenswrapper[4555]: E1209 15:03:10.530936 4555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28da5b30abf7efc9e301eca9bc2d4566e6101847965e922da990de0153d47328\": container with ID starting with 28da5b30abf7efc9e301eca9bc2d4566e6101847965e922da990de0153d47328 not found: ID does not exist" containerID="28da5b30abf7efc9e301eca9bc2d4566e6101847965e922da990de0153d47328" Dec 09 15:03:10 crc kubenswrapper[4555]: I1209 15:03:10.530971 4555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28da5b30abf7efc9e301eca9bc2d4566e6101847965e922da990de0153d47328"} err="failed to get container status \"28da5b30abf7efc9e301eca9bc2d4566e6101847965e922da990de0153d47328\": rpc error: code = NotFound desc = could not find container \"28da5b30abf7efc9e301eca9bc2d4566e6101847965e922da990de0153d47328\": container with ID starting with 28da5b30abf7efc9e301eca9bc2d4566e6101847965e922da990de0153d47328 not found: ID does not exist" Dec 09 15:03:10 crc kubenswrapper[4555]: I1209 15:03:10.530993 4555 scope.go:117] "RemoveContainer" containerID="05fee7b2bfa9af8f8109fe4522cf82cee4fa3aea171cbb4d623eb4ea0d81f4fa" Dec 09 15:03:10 crc kubenswrapper[4555]: I1209 15:03:10.541162 4555 scope.go:117] "RemoveContainer" containerID="05fee7b2bfa9af8f8109fe4522cf82cee4fa3aea171cbb4d623eb4ea0d81f4fa" Dec 09 15:03:10 crc kubenswrapper[4555]: E1209 15:03:10.541481 4555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"05fee7b2bfa9af8f8109fe4522cf82cee4fa3aea171cbb4d623eb4ea0d81f4fa\": container with ID starting with 05fee7b2bfa9af8f8109fe4522cf82cee4fa3aea171cbb4d623eb4ea0d81f4fa not found: ID does not exist" containerID="05fee7b2bfa9af8f8109fe4522cf82cee4fa3aea171cbb4d623eb4ea0d81f4fa" Dec 09 15:03:10 crc kubenswrapper[4555]: I1209 15:03:10.541508 4555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05fee7b2bfa9af8f8109fe4522cf82cee4fa3aea171cbb4d623eb4ea0d81f4fa"} err="failed to get container status \"05fee7b2bfa9af8f8109fe4522cf82cee4fa3aea171cbb4d623eb4ea0d81f4fa\": rpc error: code = NotFound desc = could not find container \"05fee7b2bfa9af8f8109fe4522cf82cee4fa3aea171cbb4d623eb4ea0d81f4fa\": container with ID starting with 05fee7b2bfa9af8f8109fe4522cf82cee4fa3aea171cbb4d623eb4ea0d81f4fa not found: ID does not exist" Dec 09 15:03:10 crc kubenswrapper[4555]: I1209 15:03:10.543626 4555 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-rf8rn"] Dec 09 15:03:10 crc kubenswrapper[4555]: I1209 15:03:10.546755 4555 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-rf8rn"] Dec 09 15:03:10 crc kubenswrapper[4555]: I1209 15:03:10.549423 4555 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qdpr6"] Dec 09 15:03:10 crc kubenswrapper[4555]: I1209 15:03:10.551819 4555 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qdpr6"] Dec 09 15:03:11 crc kubenswrapper[4555]: I1209 15:03:11.286871 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/858a1ce9-f128-41e2-b6e0-3fabde0c8d33-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-tnf48\" (UID: \"858a1ce9-f128-41e2-b6e0-3fabde0c8d33\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-tnf48" Dec 09 15:03:11 crc kubenswrapper[4555]: E1209 15:03:11.287017 4555 secret.go:188] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found Dec 09 15:03:11 crc kubenswrapper[4555]: E1209 15:03:11.287069 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/858a1ce9-f128-41e2-b6e0-3fabde0c8d33-tls-certificates podName:858a1ce9-f128-41e2-b6e0-3fabde0c8d33 nodeName:}" failed. No retries permitted until 2025-12-09 15:03:13.287055586 +0000 UTC m=+303.423768304 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/858a1ce9-f128-41e2-b6e0-3fabde0c8d33-tls-certificates") pod "prometheus-operator-admission-webhook-f54c54754-tnf48" (UID: "858a1ce9-f128-41e2-b6e0-3fabde0c8d33") : secret "prometheus-operator-admission-webhook-tls" not found Dec 09 15:03:11 crc kubenswrapper[4555]: I1209 15:03:11.832403 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6445bb5c56-sxsz2"] Dec 09 15:03:11 crc kubenswrapper[4555]: E1209 15:03:11.832589 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0f5603f-8e68-4cfb-9aa1-a6854b684172" containerName="route-controller-manager" Dec 09 15:03:11 crc kubenswrapper[4555]: I1209 15:03:11.832600 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0f5603f-8e68-4cfb-9aa1-a6854b684172" containerName="route-controller-manager" Dec 09 15:03:11 crc kubenswrapper[4555]: E1209 15:03:11.832615 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dae2070b-58ae-47b7-8609-3ca457ed809a" containerName="controller-manager" Dec 09 15:03:11 crc kubenswrapper[4555]: I1209 15:03:11.832620 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="dae2070b-58ae-47b7-8609-3ca457ed809a" containerName="controller-manager" Dec 09 15:03:11 crc kubenswrapper[4555]: I1209 15:03:11.832710 4555 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0f5603f-8e68-4cfb-9aa1-a6854b684172" containerName="route-controller-manager" Dec 09 15:03:11 crc kubenswrapper[4555]: I1209 15:03:11.832719 4555 memory_manager.go:354] "RemoveStaleState removing state" podUID="dae2070b-58ae-47b7-8609-3ca457ed809a" containerName="controller-manager" Dec 09 15:03:11 crc kubenswrapper[4555]: I1209 15:03:11.833033 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6445bb5c56-sxsz2" Dec 09 15:03:11 crc kubenswrapper[4555]: I1209 15:03:11.834355 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 09 15:03:11 crc kubenswrapper[4555]: I1209 15:03:11.834608 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 09 15:03:11 crc kubenswrapper[4555]: I1209 15:03:11.834948 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 09 15:03:11 crc kubenswrapper[4555]: I1209 15:03:11.834972 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 09 15:03:11 crc kubenswrapper[4555]: I1209 15:03:11.835016 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 09 15:03:11 crc kubenswrapper[4555]: I1209 15:03:11.835207 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 09 15:03:11 crc kubenswrapper[4555]: I1209 15:03:11.835283 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-749684855f-5jg87"] Dec 09 15:03:11 crc kubenswrapper[4555]: I1209 15:03:11.836171 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-749684855f-5jg87" Dec 09 15:03:11 crc kubenswrapper[4555]: I1209 15:03:11.837383 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 09 15:03:11 crc kubenswrapper[4555]: I1209 15:03:11.837759 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 09 15:03:11 crc kubenswrapper[4555]: I1209 15:03:11.837927 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 09 15:03:11 crc kubenswrapper[4555]: I1209 15:03:11.837957 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 09 15:03:11 crc kubenswrapper[4555]: I1209 15:03:11.839157 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 09 15:03:11 crc kubenswrapper[4555]: I1209 15:03:11.839191 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 09 15:03:11 crc kubenswrapper[4555]: I1209 15:03:11.840148 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6445bb5c56-sxsz2"] Dec 09 15:03:11 crc kubenswrapper[4555]: I1209 15:03:11.843277 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-749684855f-5jg87"] Dec 09 15:03:11 crc kubenswrapper[4555]: I1209 15:03:11.843671 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 09 15:03:11 crc kubenswrapper[4555]: I1209 15:03:11.893852 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58c6e436-b767-4a78-aa21-9f8466d28b3e-config\") pod \"controller-manager-749684855f-5jg87\" (UID: \"58c6e436-b767-4a78-aa21-9f8466d28b3e\") " pod="openshift-controller-manager/controller-manager-749684855f-5jg87" Dec 09 15:03:11 crc kubenswrapper[4555]: I1209 15:03:11.893885 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2477\" (UniqueName: \"kubernetes.io/projected/f5feebc2-24be-46a3-a5d7-6e605bc6a073-kube-api-access-m2477\") pod \"route-controller-manager-6445bb5c56-sxsz2\" (UID: \"f5feebc2-24be-46a3-a5d7-6e605bc6a073\") " pod="openshift-route-controller-manager/route-controller-manager-6445bb5c56-sxsz2" Dec 09 15:03:11 crc kubenswrapper[4555]: I1209 15:03:11.893912 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/58c6e436-b767-4a78-aa21-9f8466d28b3e-proxy-ca-bundles\") pod \"controller-manager-749684855f-5jg87\" (UID: \"58c6e436-b767-4a78-aa21-9f8466d28b3e\") " pod="openshift-controller-manager/controller-manager-749684855f-5jg87" Dec 09 15:03:11 crc kubenswrapper[4555]: I1209 15:03:11.893942 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5feebc2-24be-46a3-a5d7-6e605bc6a073-serving-cert\") pod \"route-controller-manager-6445bb5c56-sxsz2\" (UID: \"f5feebc2-24be-46a3-a5d7-6e605bc6a073\") " 
pod="openshift-route-controller-manager/route-controller-manager-6445bb5c56-sxsz2" Dec 09 15:03:11 crc kubenswrapper[4555]: I1209 15:03:11.893961 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5feebc2-24be-46a3-a5d7-6e605bc6a073-config\") pod \"route-controller-manager-6445bb5c56-sxsz2\" (UID: \"f5feebc2-24be-46a3-a5d7-6e605bc6a073\") " pod="openshift-route-controller-manager/route-controller-manager-6445bb5c56-sxsz2" Dec 09 15:03:11 crc kubenswrapper[4555]: I1209 15:03:11.894127 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/58c6e436-b767-4a78-aa21-9f8466d28b3e-client-ca\") pod \"controller-manager-749684855f-5jg87\" (UID: \"58c6e436-b767-4a78-aa21-9f8466d28b3e\") " pod="openshift-controller-manager/controller-manager-749684855f-5jg87" Dec 09 15:03:11 crc kubenswrapper[4555]: I1209 15:03:11.894219 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5rgm\" (UniqueName: \"kubernetes.io/projected/58c6e436-b767-4a78-aa21-9f8466d28b3e-kube-api-access-h5rgm\") pod \"controller-manager-749684855f-5jg87\" (UID: \"58c6e436-b767-4a78-aa21-9f8466d28b3e\") " pod="openshift-controller-manager/controller-manager-749684855f-5jg87" Dec 09 15:03:11 crc kubenswrapper[4555]: I1209 15:03:11.894257 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58c6e436-b767-4a78-aa21-9f8466d28b3e-serving-cert\") pod \"controller-manager-749684855f-5jg87\" (UID: \"58c6e436-b767-4a78-aa21-9f8466d28b3e\") " pod="openshift-controller-manager/controller-manager-749684855f-5jg87" Dec 09 15:03:11 crc kubenswrapper[4555]: I1209 15:03:11.894280 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f5feebc2-24be-46a3-a5d7-6e605bc6a073-client-ca\") pod \"route-controller-manager-6445bb5c56-sxsz2\" (UID: \"f5feebc2-24be-46a3-a5d7-6e605bc6a073\") " pod="openshift-route-controller-manager/route-controller-manager-6445bb5c56-sxsz2" Dec 09 15:03:11 crc kubenswrapper[4555]: I1209 15:03:11.995162 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5feebc2-24be-46a3-a5d7-6e605bc6a073-config\") pod \"route-controller-manager-6445bb5c56-sxsz2\" (UID: \"f5feebc2-24be-46a3-a5d7-6e605bc6a073\") " pod="openshift-route-controller-manager/route-controller-manager-6445bb5c56-sxsz2" Dec 09 15:03:11 crc kubenswrapper[4555]: I1209 15:03:11.995217 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/58c6e436-b767-4a78-aa21-9f8466d28b3e-client-ca\") pod \"controller-manager-749684855f-5jg87\" (UID: \"58c6e436-b767-4a78-aa21-9f8466d28b3e\") " pod="openshift-controller-manager/controller-manager-749684855f-5jg87" Dec 09 15:03:11 crc kubenswrapper[4555]: I1209 15:03:11.995255 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5rgm\" (UniqueName: \"kubernetes.io/projected/58c6e436-b767-4a78-aa21-9f8466d28b3e-kube-api-access-h5rgm\") pod \"controller-manager-749684855f-5jg87\" (UID: \"58c6e436-b767-4a78-aa21-9f8466d28b3e\") " 
pod="openshift-controller-manager/controller-manager-749684855f-5jg87" Dec 09 15:03:11 crc kubenswrapper[4555]: I1209 15:03:11.995274 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58c6e436-b767-4a78-aa21-9f8466d28b3e-serving-cert\") pod \"controller-manager-749684855f-5jg87\" (UID: \"58c6e436-b767-4a78-aa21-9f8466d28b3e\") " pod="openshift-controller-manager/controller-manager-749684855f-5jg87" Dec 09 15:03:11 crc kubenswrapper[4555]: I1209 15:03:11.995290 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f5feebc2-24be-46a3-a5d7-6e605bc6a073-client-ca\") pod \"route-controller-manager-6445bb5c56-sxsz2\" (UID: \"f5feebc2-24be-46a3-a5d7-6e605bc6a073\") " pod="openshift-route-controller-manager/route-controller-manager-6445bb5c56-sxsz2" Dec 09 15:03:11 crc kubenswrapper[4555]: I1209 15:03:11.995330 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58c6e436-b767-4a78-aa21-9f8466d28b3e-config\") pod \"controller-manager-749684855f-5jg87\" (UID: \"58c6e436-b767-4a78-aa21-9f8466d28b3e\") " pod="openshift-controller-manager/controller-manager-749684855f-5jg87" Dec 09 15:03:11 crc kubenswrapper[4555]: I1209 15:03:11.995346 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2477\" (UniqueName: \"kubernetes.io/projected/f5feebc2-24be-46a3-a5d7-6e605bc6a073-kube-api-access-m2477\") pod \"route-controller-manager-6445bb5c56-sxsz2\" (UID: \"f5feebc2-24be-46a3-a5d7-6e605bc6a073\") " pod="openshift-route-controller-manager/route-controller-manager-6445bb5c56-sxsz2" Dec 09 15:03:11 crc kubenswrapper[4555]: I1209 15:03:11.995367 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/58c6e436-b767-4a78-aa21-9f8466d28b3e-proxy-ca-bundles\") pod \"controller-manager-749684855f-5jg87\" (UID: \"58c6e436-b767-4a78-aa21-9f8466d28b3e\") " pod="openshift-controller-manager/controller-manager-749684855f-5jg87" Dec 09 15:03:11 crc kubenswrapper[4555]: I1209 15:03:11.995392 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5feebc2-24be-46a3-a5d7-6e605bc6a073-serving-cert\") pod \"route-controller-manager-6445bb5c56-sxsz2\" (UID: \"f5feebc2-24be-46a3-a5d7-6e605bc6a073\") " pod="openshift-route-controller-manager/route-controller-manager-6445bb5c56-sxsz2" Dec 09 15:03:11 crc kubenswrapper[4555]: I1209 15:03:11.996143 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/58c6e436-b767-4a78-aa21-9f8466d28b3e-client-ca\") pod \"controller-manager-749684855f-5jg87\" (UID: \"58c6e436-b767-4a78-aa21-9f8466d28b3e\") " pod="openshift-controller-manager/controller-manager-749684855f-5jg87" Dec 09 15:03:11 crc kubenswrapper[4555]: I1209 15:03:11.996358 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5feebc2-24be-46a3-a5d7-6e605bc6a073-config\") pod \"route-controller-manager-6445bb5c56-sxsz2\" (UID: \"f5feebc2-24be-46a3-a5d7-6e605bc6a073\") " pod="openshift-route-controller-manager/route-controller-manager-6445bb5c56-sxsz2" Dec 09 15:03:11 crc kubenswrapper[4555]: I1209 15:03:11.996503 4555 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58c6e436-b767-4a78-aa21-9f8466d28b3e-config\") pod \"controller-manager-749684855f-5jg87\" (UID: \"58c6e436-b767-4a78-aa21-9f8466d28b3e\") " pod="openshift-controller-manager/controller-manager-749684855f-5jg87" Dec 09 15:03:11 crc kubenswrapper[4555]: I1209 15:03:11.996543 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f5feebc2-24be-46a3-a5d7-6e605bc6a073-client-ca\") pod \"route-controller-manager-6445bb5c56-sxsz2\" (UID: \"f5feebc2-24be-46a3-a5d7-6e605bc6a073\") " pod="openshift-route-controller-manager/route-controller-manager-6445bb5c56-sxsz2" Dec 09 15:03:11 crc kubenswrapper[4555]: I1209 15:03:11.996866 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/58c6e436-b767-4a78-aa21-9f8466d28b3e-proxy-ca-bundles\") pod \"controller-manager-749684855f-5jg87\" (UID: \"58c6e436-b767-4a78-aa21-9f8466d28b3e\") " pod="openshift-controller-manager/controller-manager-749684855f-5jg87" Dec 09 15:03:11 crc kubenswrapper[4555]: I1209 15:03:11.998563 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58c6e436-b767-4a78-aa21-9f8466d28b3e-serving-cert\") pod \"controller-manager-749684855f-5jg87\" (UID: \"58c6e436-b767-4a78-aa21-9f8466d28b3e\") " pod="openshift-controller-manager/controller-manager-749684855f-5jg87" Dec 09 15:03:11 crc kubenswrapper[4555]: I1209 15:03:11.999223 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5feebc2-24be-46a3-a5d7-6e605bc6a073-serving-cert\") pod \"route-controller-manager-6445bb5c56-sxsz2\" (UID: \"f5feebc2-24be-46a3-a5d7-6e605bc6a073\") " pod="openshift-route-controller-manager/route-controller-manager-6445bb5c56-sxsz2" Dec 09 15:03:12 crc kubenswrapper[4555]: I1209 15:03:12.007536 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5rgm\" (UniqueName: \"kubernetes.io/projected/58c6e436-b767-4a78-aa21-9f8466d28b3e-kube-api-access-h5rgm\") pod \"controller-manager-749684855f-5jg87\" (UID: \"58c6e436-b767-4a78-aa21-9f8466d28b3e\") " pod="openshift-controller-manager/controller-manager-749684855f-5jg87" Dec 09 15:03:12 crc kubenswrapper[4555]: I1209 15:03:12.007701 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2477\" (UniqueName: \"kubernetes.io/projected/f5feebc2-24be-46a3-a5d7-6e605bc6a073-kube-api-access-m2477\") pod \"route-controller-manager-6445bb5c56-sxsz2\" (UID: \"f5feebc2-24be-46a3-a5d7-6e605bc6a073\") " pod="openshift-route-controller-manager/route-controller-manager-6445bb5c56-sxsz2" Dec 09 15:03:12 crc kubenswrapper[4555]: I1209 15:03:12.144289 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6445bb5c56-sxsz2" Dec 09 15:03:12 crc kubenswrapper[4555]: I1209 15:03:12.150144 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-749684855f-5jg87" Dec 09 15:03:12 crc kubenswrapper[4555]: I1209 15:03:12.166575 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0f5603f-8e68-4cfb-9aa1-a6854b684172" path="/var/lib/kubelet/pods/d0f5603f-8e68-4cfb-9aa1-a6854b684172/volumes" Dec 09 15:03:12 crc kubenswrapper[4555]: I1209 15:03:12.167211 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dae2070b-58ae-47b7-8609-3ca457ed809a" path="/var/lib/kubelet/pods/dae2070b-58ae-47b7-8609-3ca457ed809a/volumes" Dec 09 15:03:12 crc kubenswrapper[4555]: I1209 15:03:12.301000 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-749684855f-5jg87"] Dec 09 15:03:12 crc kubenswrapper[4555]: I1209 15:03:12.479597 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6445bb5c56-sxsz2"] Dec 09 15:03:12 crc kubenswrapper[4555]: W1209 15:03:12.482469 4555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5feebc2_24be_46a3_a5d7_6e605bc6a073.slice/crio-08cdcdb25614adb9a7ef95bc31cd2b9a17890664c37b0f4ffbc9cbf8a658c1e0 WatchSource:0}: Error finding container 08cdcdb25614adb9a7ef95bc31cd2b9a17890664c37b0f4ffbc9cbf8a658c1e0: Status 404 returned error can't find the container with id 08cdcdb25614adb9a7ef95bc31cd2b9a17890664c37b0f4ffbc9cbf8a658c1e0 Dec 09 15:03:12 crc kubenswrapper[4555]: I1209 15:03:12.532811 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-749684855f-5jg87" event={"ID":"58c6e436-b767-4a78-aa21-9f8466d28b3e","Type":"ContainerStarted","Data":"14dc01adfa6873faf4ab6b35bc678af60dabb3d70f5e76339611d0682e0bcf6a"} Dec 09 15:03:12 crc kubenswrapper[4555]: I1209 15:03:12.532853 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-749684855f-5jg87" event={"ID":"58c6e436-b767-4a78-aa21-9f8466d28b3e","Type":"ContainerStarted","Data":"23db9dcbb276d334aaa17e9ad58dba0a99d834cb369ee70e0409903deb88288c"} Dec 09 15:03:12 crc kubenswrapper[4555]: I1209 15:03:12.533350 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-749684855f-5jg87" Dec 09 15:03:12 crc kubenswrapper[4555]: I1209 15:03:12.534141 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6445bb5c56-sxsz2" event={"ID":"f5feebc2-24be-46a3-a5d7-6e605bc6a073","Type":"ContainerStarted","Data":"08cdcdb25614adb9a7ef95bc31cd2b9a17890664c37b0f4ffbc9cbf8a658c1e0"} Dec 09 15:03:12 crc kubenswrapper[4555]: I1209 15:03:12.547701 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-749684855f-5jg87" Dec 09 15:03:12 crc kubenswrapper[4555]: I1209 15:03:12.547886 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-749684855f-5jg87" podStartSLOduration=3.547876312 podStartE2EDuration="3.547876312s" podCreationTimestamp="2025-12-09 15:03:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:03:12.544679526 +0000 UTC m=+302.681392245" watchObservedRunningTime="2025-12-09 15:03:12.547876312 +0000 UTC 
m=+302.684589030" Dec 09 15:03:13 crc kubenswrapper[4555]: I1209 15:03:13.313879 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/858a1ce9-f128-41e2-b6e0-3fabde0c8d33-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-tnf48\" (UID: \"858a1ce9-f128-41e2-b6e0-3fabde0c8d33\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-tnf48" Dec 09 15:03:13 crc kubenswrapper[4555]: E1209 15:03:13.313967 4555 secret.go:188] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found Dec 09 15:03:13 crc kubenswrapper[4555]: E1209 15:03:13.314033 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/858a1ce9-f128-41e2-b6e0-3fabde0c8d33-tls-certificates podName:858a1ce9-f128-41e2-b6e0-3fabde0c8d33 nodeName:}" failed. No retries permitted until 2025-12-09 15:03:17.314017348 +0000 UTC m=+307.450730065 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/858a1ce9-f128-41e2-b6e0-3fabde0c8d33-tls-certificates") pod "prometheus-operator-admission-webhook-f54c54754-tnf48" (UID: "858a1ce9-f128-41e2-b6e0-3fabde0c8d33") : secret "prometheus-operator-admission-webhook-tls" not found Dec 09 15:03:13 crc kubenswrapper[4555]: I1209 15:03:13.538804 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6445bb5c56-sxsz2" event={"ID":"f5feebc2-24be-46a3-a5d7-6e605bc6a073","Type":"ContainerStarted","Data":"a3d3d4014799e8dd6553ece54019fa164d1b6de366574c3c34720d4aff7aae54"} Dec 09 15:03:13 crc kubenswrapper[4555]: I1209 15:03:13.551003 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6445bb5c56-sxsz2" podStartSLOduration=3.550989949 podStartE2EDuration="3.550989949s" podCreationTimestamp="2025-12-09 15:03:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:03:13.550795 +0000 UTC m=+303.687507717" watchObservedRunningTime="2025-12-09 15:03:13.550989949 +0000 UTC m=+303.687702667" Dec 09 15:03:14 crc kubenswrapper[4555]: I1209 15:03:14.543103 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6445bb5c56-sxsz2" Dec 09 15:03:14 crc kubenswrapper[4555]: I1209 15:03:14.546854 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6445bb5c56-sxsz2" Dec 09 15:03:17 crc kubenswrapper[4555]: I1209 15:03:17.355862 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/858a1ce9-f128-41e2-b6e0-3fabde0c8d33-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-tnf48\" (UID: \"858a1ce9-f128-41e2-b6e0-3fabde0c8d33\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-tnf48" Dec 09 15:03:17 crc kubenswrapper[4555]: E1209 15:03:17.356009 4555 secret.go:188] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found Dec 09 15:03:17 crc kubenswrapper[4555]: E1209 15:03:17.356237 4555 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/858a1ce9-f128-41e2-b6e0-3fabde0c8d33-tls-certificates podName:858a1ce9-f128-41e2-b6e0-3fabde0c8d33 nodeName:}" failed. No retries permitted until 2025-12-09 15:03:25.35622109 +0000 UTC m=+315.492933807 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/858a1ce9-f128-41e2-b6e0-3fabde0c8d33-tls-certificates") pod "prometheus-operator-admission-webhook-f54c54754-tnf48" (UID: "858a1ce9-f128-41e2-b6e0-3fabde0c8d33") : secret "prometheus-operator-admission-webhook-tls" not found Dec 09 15:03:18 crc kubenswrapper[4555]: I1209 15:03:18.272964 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fflnp"] Dec 09 15:03:18 crc kubenswrapper[4555]: I1209 15:03:18.273851 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fflnp" Dec 09 15:03:18 crc kubenswrapper[4555]: I1209 15:03:18.275330 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 09 15:03:18 crc kubenswrapper[4555]: I1209 15:03:18.282092 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fflnp"] Dec 09 15:03:18 crc kubenswrapper[4555]: I1209 15:03:18.365630 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b25e4e41-17e1-4ca8-8761-d953a49b0f22-catalog-content\") pod \"redhat-operators-fflnp\" (UID: \"b25e4e41-17e1-4ca8-8761-d953a49b0f22\") " pod="openshift-marketplace/redhat-operators-fflnp" Dec 09 15:03:18 crc kubenswrapper[4555]: I1209 15:03:18.365678 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5997s\" (UniqueName: \"kubernetes.io/projected/b25e4e41-17e1-4ca8-8761-d953a49b0f22-kube-api-access-5997s\") pod \"redhat-operators-fflnp\" (UID: \"b25e4e41-17e1-4ca8-8761-d953a49b0f22\") " pod="openshift-marketplace/redhat-operators-fflnp" Dec 09 15:03:18 crc kubenswrapper[4555]: I1209 15:03:18.365699 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b25e4e41-17e1-4ca8-8761-d953a49b0f22-utilities\") pod \"redhat-operators-fflnp\" (UID: \"b25e4e41-17e1-4ca8-8761-d953a49b0f22\") " pod="openshift-marketplace/redhat-operators-fflnp" Dec 09 15:03:18 crc kubenswrapper[4555]: I1209 15:03:18.466280 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b25e4e41-17e1-4ca8-8761-d953a49b0f22-catalog-content\") pod \"redhat-operators-fflnp\" (UID: \"b25e4e41-17e1-4ca8-8761-d953a49b0f22\") " pod="openshift-marketplace/redhat-operators-fflnp" Dec 09 15:03:18 crc kubenswrapper[4555]: I1209 15:03:18.466346 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5997s\" (UniqueName: \"kubernetes.io/projected/b25e4e41-17e1-4ca8-8761-d953a49b0f22-kube-api-access-5997s\") pod \"redhat-operators-fflnp\" (UID: \"b25e4e41-17e1-4ca8-8761-d953a49b0f22\") " pod="openshift-marketplace/redhat-operators-fflnp" Dec 09 15:03:18 crc kubenswrapper[4555]: I1209 15:03:18.466368 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/b25e4e41-17e1-4ca8-8761-d953a49b0f22-utilities\") pod \"redhat-operators-fflnp\" (UID: \"b25e4e41-17e1-4ca8-8761-d953a49b0f22\") " pod="openshift-marketplace/redhat-operators-fflnp" Dec 09 15:03:18 crc kubenswrapper[4555]: I1209 15:03:18.466943 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b25e4e41-17e1-4ca8-8761-d953a49b0f22-utilities\") pod \"redhat-operators-fflnp\" (UID: \"b25e4e41-17e1-4ca8-8761-d953a49b0f22\") " pod="openshift-marketplace/redhat-operators-fflnp" Dec 09 15:03:18 crc kubenswrapper[4555]: I1209 15:03:18.467140 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b25e4e41-17e1-4ca8-8761-d953a49b0f22-catalog-content\") pod \"redhat-operators-fflnp\" (UID: \"b25e4e41-17e1-4ca8-8761-d953a49b0f22\") " pod="openshift-marketplace/redhat-operators-fflnp" Dec 09 15:03:18 crc kubenswrapper[4555]: I1209 15:03:18.471777 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-k9bn7"] Dec 09 15:03:18 crc kubenswrapper[4555]: I1209 15:03:18.472645 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k9bn7" Dec 09 15:03:18 crc kubenswrapper[4555]: I1209 15:03:18.474135 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 09 15:03:18 crc kubenswrapper[4555]: I1209 15:03:18.478222 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k9bn7"] Dec 09 15:03:18 crc kubenswrapper[4555]: I1209 15:03:18.485106 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5997s\" (UniqueName: \"kubernetes.io/projected/b25e4e41-17e1-4ca8-8761-d953a49b0f22-kube-api-access-5997s\") pod \"redhat-operators-fflnp\" (UID: \"b25e4e41-17e1-4ca8-8761-d953a49b0f22\") " pod="openshift-marketplace/redhat-operators-fflnp" Dec 09 15:03:18 crc kubenswrapper[4555]: I1209 15:03:18.567913 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkcmv\" (UniqueName: \"kubernetes.io/projected/ca7b78bf-7f50-4515-98b8-f7581c0e90d1-kube-api-access-rkcmv\") pod \"certified-operators-k9bn7\" (UID: \"ca7b78bf-7f50-4515-98b8-f7581c0e90d1\") " pod="openshift-marketplace/certified-operators-k9bn7" Dec 09 15:03:18 crc kubenswrapper[4555]: I1209 15:03:18.568026 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca7b78bf-7f50-4515-98b8-f7581c0e90d1-catalog-content\") pod \"certified-operators-k9bn7\" (UID: \"ca7b78bf-7f50-4515-98b8-f7581c0e90d1\") " pod="openshift-marketplace/certified-operators-k9bn7" Dec 09 15:03:18 crc kubenswrapper[4555]: I1209 15:03:18.568180 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca7b78bf-7f50-4515-98b8-f7581c0e90d1-utilities\") pod \"certified-operators-k9bn7\" (UID: \"ca7b78bf-7f50-4515-98b8-f7581c0e90d1\") " pod="openshift-marketplace/certified-operators-k9bn7" Dec 09 15:03:18 crc kubenswrapper[4555]: I1209 15:03:18.586840 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fflnp" Dec 09 15:03:18 crc kubenswrapper[4555]: I1209 15:03:18.669578 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkcmv\" (UniqueName: \"kubernetes.io/projected/ca7b78bf-7f50-4515-98b8-f7581c0e90d1-kube-api-access-rkcmv\") pod \"certified-operators-k9bn7\" (UID: \"ca7b78bf-7f50-4515-98b8-f7581c0e90d1\") " pod="openshift-marketplace/certified-operators-k9bn7" Dec 09 15:03:18 crc kubenswrapper[4555]: I1209 15:03:18.669817 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca7b78bf-7f50-4515-98b8-f7581c0e90d1-catalog-content\") pod \"certified-operators-k9bn7\" (UID: \"ca7b78bf-7f50-4515-98b8-f7581c0e90d1\") " pod="openshift-marketplace/certified-operators-k9bn7" Dec 09 15:03:18 crc kubenswrapper[4555]: I1209 15:03:18.669839 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca7b78bf-7f50-4515-98b8-f7581c0e90d1-utilities\") pod \"certified-operators-k9bn7\" (UID: \"ca7b78bf-7f50-4515-98b8-f7581c0e90d1\") " pod="openshift-marketplace/certified-operators-k9bn7" Dec 09 15:03:18 crc kubenswrapper[4555]: I1209 15:03:18.670255 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca7b78bf-7f50-4515-98b8-f7581c0e90d1-utilities\") pod \"certified-operators-k9bn7\" (UID: \"ca7b78bf-7f50-4515-98b8-f7581c0e90d1\") " pod="openshift-marketplace/certified-operators-k9bn7" Dec 09 15:03:18 crc kubenswrapper[4555]: I1209 15:03:18.670342 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca7b78bf-7f50-4515-98b8-f7581c0e90d1-catalog-content\") pod \"certified-operators-k9bn7\" (UID: \"ca7b78bf-7f50-4515-98b8-f7581c0e90d1\") " pod="openshift-marketplace/certified-operators-k9bn7" Dec 09 15:03:18 crc kubenswrapper[4555]: I1209 15:03:18.683954 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkcmv\" (UniqueName: \"kubernetes.io/projected/ca7b78bf-7f50-4515-98b8-f7581c0e90d1-kube-api-access-rkcmv\") pod \"certified-operators-k9bn7\" (UID: \"ca7b78bf-7f50-4515-98b8-f7581c0e90d1\") " pod="openshift-marketplace/certified-operators-k9bn7" Dec 09 15:03:18 crc kubenswrapper[4555]: I1209 15:03:18.783743 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-k9bn7" Dec 09 15:03:18 crc kubenswrapper[4555]: I1209 15:03:18.915129 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fflnp"] Dec 09 15:03:18 crc kubenswrapper[4555]: W1209 15:03:18.921530 4555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb25e4e41_17e1_4ca8_8761_d953a49b0f22.slice/crio-78d20abef45b12f7f833e482ee12c97f9bc67c17569c80e47321da2eb9775e2d WatchSource:0}: Error finding container 78d20abef45b12f7f833e482ee12c97f9bc67c17569c80e47321da2eb9775e2d: Status 404 returned error can't find the container with id 78d20abef45b12f7f833e482ee12c97f9bc67c17569c80e47321da2eb9775e2d Dec 09 15:03:19 crc kubenswrapper[4555]: I1209 15:03:19.121224 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k9bn7"] Dec 09 15:03:19 crc kubenswrapper[4555]: W1209 15:03:19.124821 4555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca7b78bf_7f50_4515_98b8_f7581c0e90d1.slice/crio-af6cf06a62caac8108a759564f54ce99c290975d7388af6d58a436164e3573db WatchSource:0}: Error finding container af6cf06a62caac8108a759564f54ce99c290975d7388af6d58a436164e3573db: Status 404 returned error can't find the container with id af6cf06a62caac8108a759564f54ce99c290975d7388af6d58a436164e3573db Dec 09 15:03:19 crc kubenswrapper[4555]: I1209 15:03:19.563097 4555 generic.go:334] "Generic (PLEG): container finished" podID="ca7b78bf-7f50-4515-98b8-f7581c0e90d1" containerID="8228113bc1e5aff32178e912224611ae4996757d80978c62aad8b24e2dc8d586" exitCode=0 Dec 09 15:03:19 crc kubenswrapper[4555]: I1209 15:03:19.563142 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k9bn7" event={"ID":"ca7b78bf-7f50-4515-98b8-f7581c0e90d1","Type":"ContainerDied","Data":"8228113bc1e5aff32178e912224611ae4996757d80978c62aad8b24e2dc8d586"} Dec 09 15:03:19 crc kubenswrapper[4555]: I1209 15:03:19.563182 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k9bn7" event={"ID":"ca7b78bf-7f50-4515-98b8-f7581c0e90d1","Type":"ContainerStarted","Data":"af6cf06a62caac8108a759564f54ce99c290975d7388af6d58a436164e3573db"} Dec 09 15:03:19 crc kubenswrapper[4555]: I1209 15:03:19.564327 4555 generic.go:334] "Generic (PLEG): container finished" podID="b25e4e41-17e1-4ca8-8761-d953a49b0f22" containerID="2ecaf611ae3eea4260273d2ccb5c562bd62e426724688470792a83b962fb012e" exitCode=0 Dec 09 15:03:19 crc kubenswrapper[4555]: I1209 15:03:19.564354 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fflnp" event={"ID":"b25e4e41-17e1-4ca8-8761-d953a49b0f22","Type":"ContainerDied","Data":"2ecaf611ae3eea4260273d2ccb5c562bd62e426724688470792a83b962fb012e"} Dec 09 15:03:19 crc kubenswrapper[4555]: I1209 15:03:19.564378 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fflnp" event={"ID":"b25e4e41-17e1-4ca8-8761-d953a49b0f22","Type":"ContainerStarted","Data":"78d20abef45b12f7f833e482ee12c97f9bc67c17569c80e47321da2eb9775e2d"} Dec 09 15:03:19 crc kubenswrapper[4555]: I1209 15:03:19.738563 4555 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6445bb5c56-sxsz2"] Dec 09 15:03:19 crc kubenswrapper[4555]: I1209 
15:03:19.738720 4555 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6445bb5c56-sxsz2" podUID="f5feebc2-24be-46a3-a5d7-6e605bc6a073" containerName="route-controller-manager" containerID="cri-o://a3d3d4014799e8dd6553ece54019fa164d1b6de366574c3c34720d4aff7aae54" gracePeriod=30 Dec 09 15:03:20 crc kubenswrapper[4555]: I1209 15:03:20.065358 4555 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6445bb5c56-sxsz2" Dec 09 15:03:20 crc kubenswrapper[4555]: I1209 15:03:20.187609 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2477\" (UniqueName: \"kubernetes.io/projected/f5feebc2-24be-46a3-a5d7-6e605bc6a073-kube-api-access-m2477\") pod \"f5feebc2-24be-46a3-a5d7-6e605bc6a073\" (UID: \"f5feebc2-24be-46a3-a5d7-6e605bc6a073\") " Dec 09 15:03:20 crc kubenswrapper[4555]: I1209 15:03:20.187650 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f5feebc2-24be-46a3-a5d7-6e605bc6a073-client-ca\") pod \"f5feebc2-24be-46a3-a5d7-6e605bc6a073\" (UID: \"f5feebc2-24be-46a3-a5d7-6e605bc6a073\") " Dec 09 15:03:20 crc kubenswrapper[4555]: I1209 15:03:20.187680 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5feebc2-24be-46a3-a5d7-6e605bc6a073-config\") pod \"f5feebc2-24be-46a3-a5d7-6e605bc6a073\" (UID: \"f5feebc2-24be-46a3-a5d7-6e605bc6a073\") " Dec 09 15:03:20 crc kubenswrapper[4555]: I1209 15:03:20.187732 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5feebc2-24be-46a3-a5d7-6e605bc6a073-serving-cert\") pod \"f5feebc2-24be-46a3-a5d7-6e605bc6a073\" (UID: \"f5feebc2-24be-46a3-a5d7-6e605bc6a073\") " Dec 09 15:03:20 crc kubenswrapper[4555]: I1209 15:03:20.188538 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5feebc2-24be-46a3-a5d7-6e605bc6a073-client-ca" (OuterVolumeSpecName: "client-ca") pod "f5feebc2-24be-46a3-a5d7-6e605bc6a073" (UID: "f5feebc2-24be-46a3-a5d7-6e605bc6a073"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 15:03:20 crc kubenswrapper[4555]: I1209 15:03:20.188707 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5feebc2-24be-46a3-a5d7-6e605bc6a073-config" (OuterVolumeSpecName: "config") pod "f5feebc2-24be-46a3-a5d7-6e605bc6a073" (UID: "f5feebc2-24be-46a3-a5d7-6e605bc6a073"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 15:03:20 crc kubenswrapper[4555]: I1209 15:03:20.192870 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5feebc2-24be-46a3-a5d7-6e605bc6a073-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f5feebc2-24be-46a3-a5d7-6e605bc6a073" (UID: "f5feebc2-24be-46a3-a5d7-6e605bc6a073"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 15:03:20 crc kubenswrapper[4555]: I1209 15:03:20.192870 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5feebc2-24be-46a3-a5d7-6e605bc6a073-kube-api-access-m2477" (OuterVolumeSpecName: "kube-api-access-m2477") pod "f5feebc2-24be-46a3-a5d7-6e605bc6a073" (UID: "f5feebc2-24be-46a3-a5d7-6e605bc6a073"). InnerVolumeSpecName "kube-api-access-m2477". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:03:20 crc kubenswrapper[4555]: I1209 15:03:20.288354 4555 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5feebc2-24be-46a3-a5d7-6e605bc6a073-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 15:03:20 crc kubenswrapper[4555]: I1209 15:03:20.288379 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2477\" (UniqueName: \"kubernetes.io/projected/f5feebc2-24be-46a3-a5d7-6e605bc6a073-kube-api-access-m2477\") on node \"crc\" DevicePath \"\"" Dec 09 15:03:20 crc kubenswrapper[4555]: I1209 15:03:20.288388 4555 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f5feebc2-24be-46a3-a5d7-6e605bc6a073-client-ca\") on node \"crc\" DevicePath \"\"" Dec 09 15:03:20 crc kubenswrapper[4555]: I1209 15:03:20.288396 4555 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5feebc2-24be-46a3-a5d7-6e605bc6a073-config\") on node \"crc\" DevicePath \"\"" Dec 09 15:03:20 crc kubenswrapper[4555]: I1209 15:03:20.570097 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fflnp" event={"ID":"b25e4e41-17e1-4ca8-8761-d953a49b0f22","Type":"ContainerStarted","Data":"6c5bcbc7a2374bffc75a184e4a2488d9a4a3429d4474a1c345e94a02c6c9c863"} Dec 09 15:03:20 crc kubenswrapper[4555]: I1209 15:03:20.572195 4555 generic.go:334] "Generic (PLEG): container finished" podID="ca7b78bf-7f50-4515-98b8-f7581c0e90d1" containerID="6df6f16213a34e8b4b695757909c991683f0ccdc9ef72d7078c03842473c2a90" exitCode=0 Dec 09 15:03:20 crc kubenswrapper[4555]: I1209 15:03:20.572331 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k9bn7" event={"ID":"ca7b78bf-7f50-4515-98b8-f7581c0e90d1","Type":"ContainerDied","Data":"6df6f16213a34e8b4b695757909c991683f0ccdc9ef72d7078c03842473c2a90"} Dec 09 15:03:20 crc kubenswrapper[4555]: I1209 15:03:20.575831 4555 generic.go:334] "Generic (PLEG): container finished" podID="f5feebc2-24be-46a3-a5d7-6e605bc6a073" containerID="a3d3d4014799e8dd6553ece54019fa164d1b6de366574c3c34720d4aff7aae54" exitCode=0 Dec 09 15:03:20 crc kubenswrapper[4555]: I1209 15:03:20.575874 4555 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6445bb5c56-sxsz2" Dec 09 15:03:20 crc kubenswrapper[4555]: I1209 15:03:20.575879 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6445bb5c56-sxsz2" event={"ID":"f5feebc2-24be-46a3-a5d7-6e605bc6a073","Type":"ContainerDied","Data":"a3d3d4014799e8dd6553ece54019fa164d1b6de366574c3c34720d4aff7aae54"} Dec 09 15:03:20 crc kubenswrapper[4555]: I1209 15:03:20.577042 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6445bb5c56-sxsz2" event={"ID":"f5feebc2-24be-46a3-a5d7-6e605bc6a073","Type":"ContainerDied","Data":"08cdcdb25614adb9a7ef95bc31cd2b9a17890664c37b0f4ffbc9cbf8a658c1e0"} Dec 09 15:03:20 crc kubenswrapper[4555]: I1209 15:03:20.577132 4555 scope.go:117] "RemoveContainer" containerID="a3d3d4014799e8dd6553ece54019fa164d1b6de366574c3c34720d4aff7aae54" Dec 09 15:03:20 crc kubenswrapper[4555]: I1209 15:03:20.592912 4555 scope.go:117] "RemoveContainer" containerID="a3d3d4014799e8dd6553ece54019fa164d1b6de366574c3c34720d4aff7aae54" Dec 09 15:03:20 crc kubenswrapper[4555]: E1209 15:03:20.593179 4555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3d3d4014799e8dd6553ece54019fa164d1b6de366574c3c34720d4aff7aae54\": container with ID starting with a3d3d4014799e8dd6553ece54019fa164d1b6de366574c3c34720d4aff7aae54 not found: ID does not exist" containerID="a3d3d4014799e8dd6553ece54019fa164d1b6de366574c3c34720d4aff7aae54" Dec 09 15:03:20 crc kubenswrapper[4555]: I1209 15:03:20.593207 4555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3d3d4014799e8dd6553ece54019fa164d1b6de366574c3c34720d4aff7aae54"} err="failed to get container status \"a3d3d4014799e8dd6553ece54019fa164d1b6de366574c3c34720d4aff7aae54\": rpc error: code = NotFound desc = could not find container \"a3d3d4014799e8dd6553ece54019fa164d1b6de366574c3c34720d4aff7aae54\": container with ID starting with a3d3d4014799e8dd6553ece54019fa164d1b6de366574c3c34720d4aff7aae54 not found: ID does not exist" Dec 09 15:03:20 crc kubenswrapper[4555]: I1209 15:03:20.607025 4555 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6445bb5c56-sxsz2"] Dec 09 15:03:20 crc kubenswrapper[4555]: I1209 15:03:20.609719 4555 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6445bb5c56-sxsz2"] Dec 09 15:03:20 crc kubenswrapper[4555]: I1209 15:03:20.671281 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qqpmg"] Dec 09 15:03:20 crc kubenswrapper[4555]: E1209 15:03:20.671589 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5feebc2-24be-46a3-a5d7-6e605bc6a073" containerName="route-controller-manager" Dec 09 15:03:20 crc kubenswrapper[4555]: I1209 15:03:20.671665 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5feebc2-24be-46a3-a5d7-6e605bc6a073" containerName="route-controller-manager" Dec 09 15:03:20 crc kubenswrapper[4555]: I1209 15:03:20.671816 4555 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5feebc2-24be-46a3-a5d7-6e605bc6a073" containerName="route-controller-manager" Dec 09 15:03:20 crc kubenswrapper[4555]: I1209 15:03:20.672462 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qqpmg" Dec 09 15:03:20 crc kubenswrapper[4555]: I1209 15:03:20.674662 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 09 15:03:20 crc kubenswrapper[4555]: I1209 15:03:20.680888 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qqpmg"] Dec 09 15:03:20 crc kubenswrapper[4555]: I1209 15:03:20.792970 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szqcp\" (UniqueName: \"kubernetes.io/projected/523fa32b-68ac-491d-8246-d57c80030415-kube-api-access-szqcp\") pod \"community-operators-qqpmg\" (UID: \"523fa32b-68ac-491d-8246-d57c80030415\") " pod="openshift-marketplace/community-operators-qqpmg" Dec 09 15:03:20 crc kubenswrapper[4555]: I1209 15:03:20.793030 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/523fa32b-68ac-491d-8246-d57c80030415-catalog-content\") pod \"community-operators-qqpmg\" (UID: \"523fa32b-68ac-491d-8246-d57c80030415\") " pod="openshift-marketplace/community-operators-qqpmg" Dec 09 15:03:20 crc kubenswrapper[4555]: I1209 15:03:20.793063 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/523fa32b-68ac-491d-8246-d57c80030415-utilities\") pod \"community-operators-qqpmg\" (UID: \"523fa32b-68ac-491d-8246-d57c80030415\") " pod="openshift-marketplace/community-operators-qqpmg" Dec 09 15:03:20 crc kubenswrapper[4555]: I1209 15:03:20.839823 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75fbfcbdd9-dh49r"] Dec 09 15:03:20 crc kubenswrapper[4555]: I1209 15:03:20.840444 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-75fbfcbdd9-dh49r" Dec 09 15:03:20 crc kubenswrapper[4555]: I1209 15:03:20.842455 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 09 15:03:20 crc kubenswrapper[4555]: I1209 15:03:20.842472 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 09 15:03:20 crc kubenswrapper[4555]: I1209 15:03:20.842605 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 09 15:03:20 crc kubenswrapper[4555]: I1209 15:03:20.842689 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 09 15:03:20 crc kubenswrapper[4555]: I1209 15:03:20.842750 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 09 15:03:20 crc kubenswrapper[4555]: I1209 15:03:20.843654 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 09 15:03:20 crc kubenswrapper[4555]: I1209 15:03:20.849508 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75fbfcbdd9-dh49r"] Dec 09 15:03:20 crc kubenswrapper[4555]: I1209 15:03:20.871035 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jsb8t"] Dec 09 15:03:20 crc kubenswrapper[4555]: I1209 15:03:20.871956 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jsb8t" Dec 09 15:03:20 crc kubenswrapper[4555]: I1209 15:03:20.873370 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 09 15:03:20 crc kubenswrapper[4555]: I1209 15:03:20.881717 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jsb8t"] Dec 09 15:03:20 crc kubenswrapper[4555]: I1209 15:03:20.894162 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szqcp\" (UniqueName: \"kubernetes.io/projected/523fa32b-68ac-491d-8246-d57c80030415-kube-api-access-szqcp\") pod \"community-operators-qqpmg\" (UID: \"523fa32b-68ac-491d-8246-d57c80030415\") " pod="openshift-marketplace/community-operators-qqpmg" Dec 09 15:03:20 crc kubenswrapper[4555]: I1209 15:03:20.894210 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/808bc873-7767-43d8-9ca3-7602f659dac6-client-ca\") pod \"route-controller-manager-75fbfcbdd9-dh49r\" (UID: \"808bc873-7767-43d8-9ca3-7602f659dac6\") " pod="openshift-route-controller-manager/route-controller-manager-75fbfcbdd9-dh49r" Dec 09 15:03:20 crc kubenswrapper[4555]: I1209 15:03:20.894235 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/808bc873-7767-43d8-9ca3-7602f659dac6-serving-cert\") pod \"route-controller-manager-75fbfcbdd9-dh49r\" (UID: \"808bc873-7767-43d8-9ca3-7602f659dac6\") " pod="openshift-route-controller-manager/route-controller-manager-75fbfcbdd9-dh49r" Dec 09 15:03:20 crc kubenswrapper[4555]: I1209 15:03:20.894261 4555 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88tpk\" (UniqueName: \"kubernetes.io/projected/808bc873-7767-43d8-9ca3-7602f659dac6-kube-api-access-88tpk\") pod \"route-controller-manager-75fbfcbdd9-dh49r\" (UID: \"808bc873-7767-43d8-9ca3-7602f659dac6\") " pod="openshift-route-controller-manager/route-controller-manager-75fbfcbdd9-dh49r" Dec 09 15:03:20 crc kubenswrapper[4555]: I1209 15:03:20.894387 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/523fa32b-68ac-491d-8246-d57c80030415-catalog-content\") pod \"community-operators-qqpmg\" (UID: \"523fa32b-68ac-491d-8246-d57c80030415\") " pod="openshift-marketplace/community-operators-qqpmg" Dec 09 15:03:20 crc kubenswrapper[4555]: I1209 15:03:20.894454 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/523fa32b-68ac-491d-8246-d57c80030415-utilities\") pod \"community-operators-qqpmg\" (UID: \"523fa32b-68ac-491d-8246-d57c80030415\") " pod="openshift-marketplace/community-operators-qqpmg" Dec 09 15:03:20 crc kubenswrapper[4555]: I1209 15:03:20.894483 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/808bc873-7767-43d8-9ca3-7602f659dac6-config\") pod \"route-controller-manager-75fbfcbdd9-dh49r\" (UID: \"808bc873-7767-43d8-9ca3-7602f659dac6\") " pod="openshift-route-controller-manager/route-controller-manager-75fbfcbdd9-dh49r" Dec 09 15:03:20 crc kubenswrapper[4555]: I1209 15:03:20.894872 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/523fa32b-68ac-491d-8246-d57c80030415-utilities\") pod \"community-operators-qqpmg\" (UID: \"523fa32b-68ac-491d-8246-d57c80030415\") " pod="openshift-marketplace/community-operators-qqpmg" Dec 09 15:03:20 crc kubenswrapper[4555]: I1209 15:03:20.894990 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/523fa32b-68ac-491d-8246-d57c80030415-catalog-content\") pod \"community-operators-qqpmg\" (UID: \"523fa32b-68ac-491d-8246-d57c80030415\") " pod="openshift-marketplace/community-operators-qqpmg" Dec 09 15:03:20 crc kubenswrapper[4555]: I1209 15:03:20.909264 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szqcp\" (UniqueName: \"kubernetes.io/projected/523fa32b-68ac-491d-8246-d57c80030415-kube-api-access-szqcp\") pod \"community-operators-qqpmg\" (UID: \"523fa32b-68ac-491d-8246-d57c80030415\") " pod="openshift-marketplace/community-operators-qqpmg" Dec 09 15:03:20 crc kubenswrapper[4555]: I1209 15:03:20.990824 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qqpmg" Dec 09 15:03:20 crc kubenswrapper[4555]: I1209 15:03:20.995877 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/808bc873-7767-43d8-9ca3-7602f659dac6-serving-cert\") pod \"route-controller-manager-75fbfcbdd9-dh49r\" (UID: \"808bc873-7767-43d8-9ca3-7602f659dac6\") " pod="openshift-route-controller-manager/route-controller-manager-75fbfcbdd9-dh49r" Dec 09 15:03:20 crc kubenswrapper[4555]: I1209 15:03:20.995999 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88tpk\" (UniqueName: \"kubernetes.io/projected/808bc873-7767-43d8-9ca3-7602f659dac6-kube-api-access-88tpk\") pod \"route-controller-manager-75fbfcbdd9-dh49r\" (UID: \"808bc873-7767-43d8-9ca3-7602f659dac6\") " pod="openshift-route-controller-manager/route-controller-manager-75fbfcbdd9-dh49r" Dec 09 15:03:20 crc kubenswrapper[4555]: I1209 15:03:20.996100 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzvtz\" (UniqueName: \"kubernetes.io/projected/4e51fd14-b652-4c2f-8919-bca6fe51cc06-kube-api-access-hzvtz\") pod \"redhat-marketplace-jsb8t\" (UID: \"4e51fd14-b652-4c2f-8919-bca6fe51cc06\") " pod="openshift-marketplace/redhat-marketplace-jsb8t" Dec 09 15:03:20 crc kubenswrapper[4555]: I1209 15:03:20.996186 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e51fd14-b652-4c2f-8919-bca6fe51cc06-utilities\") pod \"redhat-marketplace-jsb8t\" (UID: \"4e51fd14-b652-4c2f-8919-bca6fe51cc06\") " pod="openshift-marketplace/redhat-marketplace-jsb8t" Dec 09 15:03:20 crc kubenswrapper[4555]: I1209 15:03:20.996270 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/808bc873-7767-43d8-9ca3-7602f659dac6-config\") pod \"route-controller-manager-75fbfcbdd9-dh49r\" (UID: \"808bc873-7767-43d8-9ca3-7602f659dac6\") " pod="openshift-route-controller-manager/route-controller-manager-75fbfcbdd9-dh49r" Dec 09 15:03:20 crc kubenswrapper[4555]: I1209 15:03:20.996433 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e51fd14-b652-4c2f-8919-bca6fe51cc06-catalog-content\") pod \"redhat-marketplace-jsb8t\" (UID: \"4e51fd14-b652-4c2f-8919-bca6fe51cc06\") " pod="openshift-marketplace/redhat-marketplace-jsb8t" Dec 09 15:03:20 crc kubenswrapper[4555]: I1209 15:03:20.996550 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/808bc873-7767-43d8-9ca3-7602f659dac6-client-ca\") pod \"route-controller-manager-75fbfcbdd9-dh49r\" (UID: \"808bc873-7767-43d8-9ca3-7602f659dac6\") " pod="openshift-route-controller-manager/route-controller-manager-75fbfcbdd9-dh49r" Dec 09 15:03:20 crc kubenswrapper[4555]: I1209 15:03:20.997176 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/808bc873-7767-43d8-9ca3-7602f659dac6-client-ca\") pod \"route-controller-manager-75fbfcbdd9-dh49r\" (UID: \"808bc873-7767-43d8-9ca3-7602f659dac6\") " pod="openshift-route-controller-manager/route-controller-manager-75fbfcbdd9-dh49r" Dec 09 15:03:20 crc kubenswrapper[4555]: I1209 15:03:20.997252 4555 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/808bc873-7767-43d8-9ca3-7602f659dac6-config\") pod \"route-controller-manager-75fbfcbdd9-dh49r\" (UID: \"808bc873-7767-43d8-9ca3-7602f659dac6\") " pod="openshift-route-controller-manager/route-controller-manager-75fbfcbdd9-dh49r" Dec 09 15:03:20 crc kubenswrapper[4555]: I1209 15:03:20.999667 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/808bc873-7767-43d8-9ca3-7602f659dac6-serving-cert\") pod \"route-controller-manager-75fbfcbdd9-dh49r\" (UID: \"808bc873-7767-43d8-9ca3-7602f659dac6\") " pod="openshift-route-controller-manager/route-controller-manager-75fbfcbdd9-dh49r" Dec 09 15:03:21 crc kubenswrapper[4555]: I1209 15:03:21.008342 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88tpk\" (UniqueName: \"kubernetes.io/projected/808bc873-7767-43d8-9ca3-7602f659dac6-kube-api-access-88tpk\") pod \"route-controller-manager-75fbfcbdd9-dh49r\" (UID: \"808bc873-7767-43d8-9ca3-7602f659dac6\") " pod="openshift-route-controller-manager/route-controller-manager-75fbfcbdd9-dh49r" Dec 09 15:03:21 crc kubenswrapper[4555]: I1209 15:03:21.097863 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzvtz\" (UniqueName: \"kubernetes.io/projected/4e51fd14-b652-4c2f-8919-bca6fe51cc06-kube-api-access-hzvtz\") pod \"redhat-marketplace-jsb8t\" (UID: \"4e51fd14-b652-4c2f-8919-bca6fe51cc06\") " pod="openshift-marketplace/redhat-marketplace-jsb8t" Dec 09 15:03:21 crc kubenswrapper[4555]: I1209 15:03:21.098038 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e51fd14-b652-4c2f-8919-bca6fe51cc06-utilities\") pod \"redhat-marketplace-jsb8t\" (UID: \"4e51fd14-b652-4c2f-8919-bca6fe51cc06\") " pod="openshift-marketplace/redhat-marketplace-jsb8t" Dec 09 15:03:21 crc kubenswrapper[4555]: I1209 15:03:21.098105 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e51fd14-b652-4c2f-8919-bca6fe51cc06-catalog-content\") pod \"redhat-marketplace-jsb8t\" (UID: \"4e51fd14-b652-4c2f-8919-bca6fe51cc06\") " pod="openshift-marketplace/redhat-marketplace-jsb8t" Dec 09 15:03:21 crc kubenswrapper[4555]: I1209 15:03:21.098594 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e51fd14-b652-4c2f-8919-bca6fe51cc06-catalog-content\") pod \"redhat-marketplace-jsb8t\" (UID: \"4e51fd14-b652-4c2f-8919-bca6fe51cc06\") " pod="openshift-marketplace/redhat-marketplace-jsb8t" Dec 09 15:03:21 crc kubenswrapper[4555]: I1209 15:03:21.099582 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e51fd14-b652-4c2f-8919-bca6fe51cc06-utilities\") pod \"redhat-marketplace-jsb8t\" (UID: \"4e51fd14-b652-4c2f-8919-bca6fe51cc06\") " pod="openshift-marketplace/redhat-marketplace-jsb8t" Dec 09 15:03:21 crc kubenswrapper[4555]: I1209 15:03:21.111553 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzvtz\" (UniqueName: \"kubernetes.io/projected/4e51fd14-b652-4c2f-8919-bca6fe51cc06-kube-api-access-hzvtz\") pod \"redhat-marketplace-jsb8t\" (UID: \"4e51fd14-b652-4c2f-8919-bca6fe51cc06\") " 
pod="openshift-marketplace/redhat-marketplace-jsb8t" Dec 09 15:03:21 crc kubenswrapper[4555]: I1209 15:03:21.154395 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-75fbfcbdd9-dh49r" Dec 09 15:03:21 crc kubenswrapper[4555]: I1209 15:03:21.188871 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jsb8t" Dec 09 15:03:21 crc kubenswrapper[4555]: I1209 15:03:21.326886 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qqpmg"] Dec 09 15:03:21 crc kubenswrapper[4555]: W1209 15:03:21.332032 4555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod523fa32b_68ac_491d_8246_d57c80030415.slice/crio-80052ab414201114be477a22dd09ce2d03d21cf7f64d487c5b52d60a0c12fbec WatchSource:0}: Error finding container 80052ab414201114be477a22dd09ce2d03d21cf7f64d487c5b52d60a0c12fbec: Status 404 returned error can't find the container with id 80052ab414201114be477a22dd09ce2d03d21cf7f64d487c5b52d60a0c12fbec Dec 09 15:03:21 crc kubenswrapper[4555]: I1209 15:03:21.493380 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75fbfcbdd9-dh49r"] Dec 09 15:03:21 crc kubenswrapper[4555]: I1209 15:03:21.545346 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jsb8t"] Dec 09 15:03:21 crc kubenswrapper[4555]: W1209 15:03:21.546894 4555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e51fd14_b652_4c2f_8919_bca6fe51cc06.slice/crio-507a9685628fd0a9161f3282b1e1369470972be7ab1bdea28d10d14dd0e063ed WatchSource:0}: Error finding container 507a9685628fd0a9161f3282b1e1369470972be7ab1bdea28d10d14dd0e063ed: Status 404 returned error can't find the container with id 507a9685628fd0a9161f3282b1e1369470972be7ab1bdea28d10d14dd0e063ed Dec 09 15:03:21 crc kubenswrapper[4555]: I1209 15:03:21.583350 4555 generic.go:334] "Generic (PLEG): container finished" podID="523fa32b-68ac-491d-8246-d57c80030415" containerID="64041ae114774173021d1801d89da2a07538a6a5901f0b7106504afd958f0a4f" exitCode=0 Dec 09 15:03:21 crc kubenswrapper[4555]: I1209 15:03:21.583397 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qqpmg" event={"ID":"523fa32b-68ac-491d-8246-d57c80030415","Type":"ContainerDied","Data":"64041ae114774173021d1801d89da2a07538a6a5901f0b7106504afd958f0a4f"} Dec 09 15:03:21 crc kubenswrapper[4555]: I1209 15:03:21.583446 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qqpmg" event={"ID":"523fa32b-68ac-491d-8246-d57c80030415","Type":"ContainerStarted","Data":"80052ab414201114be477a22dd09ce2d03d21cf7f64d487c5b52d60a0c12fbec"} Dec 09 15:03:21 crc kubenswrapper[4555]: I1209 15:03:21.589430 4555 generic.go:334] "Generic (PLEG): container finished" podID="b25e4e41-17e1-4ca8-8761-d953a49b0f22" containerID="6c5bcbc7a2374bffc75a184e4a2488d9a4a3429d4474a1c345e94a02c6c9c863" exitCode=0 Dec 09 15:03:21 crc kubenswrapper[4555]: I1209 15:03:21.589480 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fflnp" 
event={"ID":"b25e4e41-17e1-4ca8-8761-d953a49b0f22","Type":"ContainerDied","Data":"6c5bcbc7a2374bffc75a184e4a2488d9a4a3429d4474a1c345e94a02c6c9c863"} Dec 09 15:03:21 crc kubenswrapper[4555]: I1209 15:03:21.591871 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-75fbfcbdd9-dh49r" event={"ID":"808bc873-7767-43d8-9ca3-7602f659dac6","Type":"ContainerStarted","Data":"2df572ab1fb637266a053aa6564bdde838fca4f6b908943b583bc1d7ea73242f"} Dec 09 15:03:21 crc kubenswrapper[4555]: I1209 15:03:21.595256 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k9bn7" event={"ID":"ca7b78bf-7f50-4515-98b8-f7581c0e90d1","Type":"ContainerStarted","Data":"1d49199e813a49e95117448356c86e2b735ca1aae606cd3f8b71e8698abe4698"} Dec 09 15:03:21 crc kubenswrapper[4555]: I1209 15:03:21.597742 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jsb8t" event={"ID":"4e51fd14-b652-4c2f-8919-bca6fe51cc06","Type":"ContainerStarted","Data":"507a9685628fd0a9161f3282b1e1369470972be7ab1bdea28d10d14dd0e063ed"} Dec 09 15:03:21 crc kubenswrapper[4555]: I1209 15:03:21.623826 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-k9bn7" podStartSLOduration=2.134248978 podStartE2EDuration="3.623811294s" podCreationTimestamp="2025-12-09 15:03:18 +0000 UTC" firstStartedPulling="2025-12-09 15:03:19.564328497 +0000 UTC m=+309.701041216" lastFinishedPulling="2025-12-09 15:03:21.053890814 +0000 UTC m=+311.190603532" observedRunningTime="2025-12-09 15:03:21.621946473 +0000 UTC m=+311.758659191" watchObservedRunningTime="2025-12-09 15:03:21.623811294 +0000 UTC m=+311.760524012" Dec 09 15:03:22 crc kubenswrapper[4555]: I1209 15:03:22.165870 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5feebc2-24be-46a3-a5d7-6e605bc6a073" path="/var/lib/kubelet/pods/f5feebc2-24be-46a3-a5d7-6e605bc6a073/volumes" Dec 09 15:03:22 crc kubenswrapper[4555]: I1209 15:03:22.602743 4555 generic.go:334] "Generic (PLEG): container finished" podID="4e51fd14-b652-4c2f-8919-bca6fe51cc06" containerID="951b80b6e6fdad9807745c2f693c59cca143cf4851814be13fdb535bef585603" exitCode=0 Dec 09 15:03:22 crc kubenswrapper[4555]: I1209 15:03:22.602810 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jsb8t" event={"ID":"4e51fd14-b652-4c2f-8919-bca6fe51cc06","Type":"ContainerDied","Data":"951b80b6e6fdad9807745c2f693c59cca143cf4851814be13fdb535bef585603"} Dec 09 15:03:22 crc kubenswrapper[4555]: I1209 15:03:22.605044 4555 generic.go:334] "Generic (PLEG): container finished" podID="523fa32b-68ac-491d-8246-d57c80030415" containerID="f10fc74ad1e1548efa7daaecd3395134024c213df05ad19b84eb5f6895f800ac" exitCode=0 Dec 09 15:03:22 crc kubenswrapper[4555]: I1209 15:03:22.605091 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qqpmg" event={"ID":"523fa32b-68ac-491d-8246-d57c80030415","Type":"ContainerDied","Data":"f10fc74ad1e1548efa7daaecd3395134024c213df05ad19b84eb5f6895f800ac"} Dec 09 15:03:22 crc kubenswrapper[4555]: I1209 15:03:22.607084 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fflnp" event={"ID":"b25e4e41-17e1-4ca8-8761-d953a49b0f22","Type":"ContainerStarted","Data":"41770fbe758b546e8c91eda8682c09a3185143bfaf6c639c375c46edf538f8c6"} Dec 09 15:03:22 crc kubenswrapper[4555]: I1209 
15:03:22.608047 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-75fbfcbdd9-dh49r" event={"ID":"808bc873-7767-43d8-9ca3-7602f659dac6","Type":"ContainerStarted","Data":"1ac5bfcff583c2f49dfed44c6451e304daa5b1d4368b86dbcbc724d38faf0e23"} Dec 09 15:03:22 crc kubenswrapper[4555]: I1209 15:03:22.640778 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fflnp" podStartSLOduration=2.132721926 podStartE2EDuration="4.640763216s" podCreationTimestamp="2025-12-09 15:03:18 +0000 UTC" firstStartedPulling="2025-12-09 15:03:19.565454682 +0000 UTC m=+309.702167399" lastFinishedPulling="2025-12-09 15:03:22.073495971 +0000 UTC m=+312.210208689" observedRunningTime="2025-12-09 15:03:22.637592239 +0000 UTC m=+312.774304957" watchObservedRunningTime="2025-12-09 15:03:22.640763216 +0000 UTC m=+312.777475934" Dec 09 15:03:22 crc kubenswrapper[4555]: I1209 15:03:22.647976 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-75fbfcbdd9-dh49r" podStartSLOduration=3.647962734 podStartE2EDuration="3.647962734s" podCreationTimestamp="2025-12-09 15:03:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:03:22.647630345 +0000 UTC m=+312.784343062" watchObservedRunningTime="2025-12-09 15:03:22.647962734 +0000 UTC m=+312.784675451" Dec 09 15:03:23 crc kubenswrapper[4555]: I1209 15:03:23.613080 4555 generic.go:334] "Generic (PLEG): container finished" podID="4e51fd14-b652-4c2f-8919-bca6fe51cc06" containerID="ce1fa333fb29d874b33f5f30327d517424609fd716acf3a101b9c9bae410fdcc" exitCode=0 Dec 09 15:03:23 crc kubenswrapper[4555]: I1209 15:03:23.613162 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jsb8t" event={"ID":"4e51fd14-b652-4c2f-8919-bca6fe51cc06","Type":"ContainerDied","Data":"ce1fa333fb29d874b33f5f30327d517424609fd716acf3a101b9c9bae410fdcc"} Dec 09 15:03:23 crc kubenswrapper[4555]: I1209 15:03:23.615215 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qqpmg" event={"ID":"523fa32b-68ac-491d-8246-d57c80030415","Type":"ContainerStarted","Data":"2a39c9be7a5d94ed826f015e86a2b4a0de71b5d28e7102f3b59bfb1e7c2dc0e2"} Dec 09 15:03:23 crc kubenswrapper[4555]: I1209 15:03:23.615376 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-75fbfcbdd9-dh49r" Dec 09 15:03:23 crc kubenswrapper[4555]: I1209 15:03:23.620039 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-75fbfcbdd9-dh49r" Dec 09 15:03:23 crc kubenswrapper[4555]: I1209 15:03:23.637080 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qqpmg" podStartSLOduration=2.111548934 podStartE2EDuration="3.637070268s" podCreationTimestamp="2025-12-09 15:03:20 +0000 UTC" firstStartedPulling="2025-12-09 15:03:21.584739941 +0000 UTC m=+311.721452659" lastFinishedPulling="2025-12-09 15:03:23.110261275 +0000 UTC m=+313.246973993" observedRunningTime="2025-12-09 15:03:23.636102566 +0000 UTC m=+313.772815284" watchObservedRunningTime="2025-12-09 15:03:23.637070268 +0000 UTC m=+313.773782987" Dec 09 15:03:24 crc kubenswrapper[4555]: I1209 
15:03:24.621729 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jsb8t" event={"ID":"4e51fd14-b652-4c2f-8919-bca6fe51cc06","Type":"ContainerStarted","Data":"5ffbfb92ae2ff0f55251a39f1a87ff025733dafcb0f16ec39a9dd114df9fef99"} Dec 09 15:03:25 crc kubenswrapper[4555]: I1209 15:03:25.450430 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/858a1ce9-f128-41e2-b6e0-3fabde0c8d33-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-tnf48\" (UID: \"858a1ce9-f128-41e2-b6e0-3fabde0c8d33\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-tnf48" Dec 09 15:03:25 crc kubenswrapper[4555]: E1209 15:03:25.450563 4555 secret.go:188] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found Dec 09 15:03:25 crc kubenswrapper[4555]: E1209 15:03:25.450621 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/858a1ce9-f128-41e2-b6e0-3fabde0c8d33-tls-certificates podName:858a1ce9-f128-41e2-b6e0-3fabde0c8d33 nodeName:}" failed. No retries permitted until 2025-12-09 15:03:41.450608217 +0000 UTC m=+331.587320935 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/858a1ce9-f128-41e2-b6e0-3fabde0c8d33-tls-certificates") pod "prometheus-operator-admission-webhook-f54c54754-tnf48" (UID: "858a1ce9-f128-41e2-b6e0-3fabde0c8d33") : secret "prometheus-operator-admission-webhook-tls" not found Dec 09 15:03:28 crc kubenswrapper[4555]: I1209 15:03:28.588003 4555 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fflnp" Dec 09 15:03:28 crc kubenswrapper[4555]: I1209 15:03:28.588210 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fflnp" Dec 09 15:03:28 crc kubenswrapper[4555]: I1209 15:03:28.615395 4555 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fflnp" Dec 09 15:03:28 crc kubenswrapper[4555]: I1209 15:03:28.630926 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jsb8t" podStartSLOduration=7.171238859 podStartE2EDuration="8.630913699s" podCreationTimestamp="2025-12-09 15:03:20 +0000 UTC" firstStartedPulling="2025-12-09 15:03:22.604531551 +0000 UTC m=+312.741244259" lastFinishedPulling="2025-12-09 15:03:24.06420638 +0000 UTC m=+314.200919099" observedRunningTime="2025-12-09 15:03:24.638324002 +0000 UTC m=+314.775036720" watchObservedRunningTime="2025-12-09 15:03:28.630913699 +0000 UTC m=+318.767626417" Dec 09 15:03:28 crc kubenswrapper[4555]: I1209 15:03:28.659645 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fflnp" Dec 09 15:03:28 crc kubenswrapper[4555]: I1209 15:03:28.784109 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-k9bn7" Dec 09 15:03:28 crc kubenswrapper[4555]: I1209 15:03:28.784147 4555 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-k9bn7" Dec 09 15:03:28 crc kubenswrapper[4555]: I1209 15:03:28.809465 4555 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-k9bn7" Dec 09 15:03:29 crc kubenswrapper[4555]: I1209 15:03:29.668325 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-k9bn7" Dec 09 15:03:30 crc kubenswrapper[4555]: I1209 15:03:30.991220 4555 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qqpmg" Dec 09 15:03:30 crc kubenswrapper[4555]: I1209 15:03:30.991806 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qqpmg" Dec 09 15:03:31 crc kubenswrapper[4555]: I1209 15:03:31.016039 4555 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qqpmg" Dec 09 15:03:31 crc kubenswrapper[4555]: I1209 15:03:31.189209 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jsb8t" Dec 09 15:03:31 crc kubenswrapper[4555]: I1209 15:03:31.189401 4555 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jsb8t" Dec 09 15:03:31 crc kubenswrapper[4555]: I1209 15:03:31.214027 4555 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jsb8t" Dec 09 15:03:31 crc kubenswrapper[4555]: I1209 15:03:31.672399 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jsb8t" Dec 09 15:03:31 crc kubenswrapper[4555]: I1209 15:03:31.672447 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qqpmg" Dec 09 15:03:41 crc kubenswrapper[4555]: I1209 15:03:41.507719 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/858a1ce9-f128-41e2-b6e0-3fabde0c8d33-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-tnf48\" (UID: \"858a1ce9-f128-41e2-b6e0-3fabde0c8d33\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-tnf48" Dec 09 15:03:41 crc kubenswrapper[4555]: E1209 15:03:41.507884 4555 secret.go:188] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found Dec 09 15:03:41 crc kubenswrapper[4555]: E1209 15:03:41.508066 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/858a1ce9-f128-41e2-b6e0-3fabde0c8d33-tls-certificates podName:858a1ce9-f128-41e2-b6e0-3fabde0c8d33 nodeName:}" failed. No retries permitted until 2025-12-09 15:04:13.508053412 +0000 UTC m=+363.644766130 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/858a1ce9-f128-41e2-b6e0-3fabde0c8d33-tls-certificates") pod "prometheus-operator-admission-webhook-f54c54754-tnf48" (UID: "858a1ce9-f128-41e2-b6e0-3fabde0c8d33") : secret "prometheus-operator-admission-webhook-tls" not found Dec 09 15:03:49 crc kubenswrapper[4555]: I1209 15:03:49.894258 4555 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75fbfcbdd9-dh49r"] Dec 09 15:03:49 crc kubenswrapper[4555]: I1209 15:03:49.894799 4555 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-75fbfcbdd9-dh49r" podUID="808bc873-7767-43d8-9ca3-7602f659dac6" containerName="route-controller-manager" containerID="cri-o://1ac5bfcff583c2f49dfed44c6451e304daa5b1d4368b86dbcbc724d38faf0e23" gracePeriod=30 Dec 09 15:03:50 crc kubenswrapper[4555]: I1209 15:03:50.220104 4555 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-75fbfcbdd9-dh49r" Dec 09 15:03:50 crc kubenswrapper[4555]: I1209 15:03:50.301510 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/808bc873-7767-43d8-9ca3-7602f659dac6-serving-cert\") pod \"808bc873-7767-43d8-9ca3-7602f659dac6\" (UID: \"808bc873-7767-43d8-9ca3-7602f659dac6\") " Dec 09 15:03:50 crc kubenswrapper[4555]: I1209 15:03:50.301806 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/808bc873-7767-43d8-9ca3-7602f659dac6-client-ca\") pod \"808bc873-7767-43d8-9ca3-7602f659dac6\" (UID: \"808bc873-7767-43d8-9ca3-7602f659dac6\") " Dec 09 15:03:50 crc kubenswrapper[4555]: I1209 15:03:50.301871 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88tpk\" (UniqueName: \"kubernetes.io/projected/808bc873-7767-43d8-9ca3-7602f659dac6-kube-api-access-88tpk\") pod \"808bc873-7767-43d8-9ca3-7602f659dac6\" (UID: \"808bc873-7767-43d8-9ca3-7602f659dac6\") " Dec 09 15:03:50 crc kubenswrapper[4555]: I1209 15:03:50.301892 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/808bc873-7767-43d8-9ca3-7602f659dac6-config\") pod \"808bc873-7767-43d8-9ca3-7602f659dac6\" (UID: \"808bc873-7767-43d8-9ca3-7602f659dac6\") " Dec 09 15:03:50 crc kubenswrapper[4555]: I1209 15:03:50.302345 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/808bc873-7767-43d8-9ca3-7602f659dac6-client-ca" (OuterVolumeSpecName: "client-ca") pod "808bc873-7767-43d8-9ca3-7602f659dac6" (UID: "808bc873-7767-43d8-9ca3-7602f659dac6"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 15:03:50 crc kubenswrapper[4555]: I1209 15:03:50.302419 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/808bc873-7767-43d8-9ca3-7602f659dac6-config" (OuterVolumeSpecName: "config") pod "808bc873-7767-43d8-9ca3-7602f659dac6" (UID: "808bc873-7767-43d8-9ca3-7602f659dac6"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 15:03:50 crc kubenswrapper[4555]: I1209 15:03:50.305958 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/808bc873-7767-43d8-9ca3-7602f659dac6-kube-api-access-88tpk" (OuterVolumeSpecName: "kube-api-access-88tpk") pod "808bc873-7767-43d8-9ca3-7602f659dac6" (UID: "808bc873-7767-43d8-9ca3-7602f659dac6"). InnerVolumeSpecName "kube-api-access-88tpk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:03:50 crc kubenswrapper[4555]: I1209 15:03:50.306002 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/808bc873-7767-43d8-9ca3-7602f659dac6-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "808bc873-7767-43d8-9ca3-7602f659dac6" (UID: "808bc873-7767-43d8-9ca3-7602f659dac6"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 15:03:50 crc kubenswrapper[4555]: I1209 15:03:50.403237 4555 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/808bc873-7767-43d8-9ca3-7602f659dac6-client-ca\") on node \"crc\" DevicePath \"\"" Dec 09 15:03:50 crc kubenswrapper[4555]: I1209 15:03:50.403265 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88tpk\" (UniqueName: \"kubernetes.io/projected/808bc873-7767-43d8-9ca3-7602f659dac6-kube-api-access-88tpk\") on node \"crc\" DevicePath \"\"" Dec 09 15:03:50 crc kubenswrapper[4555]: I1209 15:03:50.403276 4555 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/808bc873-7767-43d8-9ca3-7602f659dac6-config\") on node \"crc\" DevicePath \"\"" Dec 09 15:03:50 crc kubenswrapper[4555]: I1209 15:03:50.403285 4555 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/808bc873-7767-43d8-9ca3-7602f659dac6-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 15:03:50 crc kubenswrapper[4555]: I1209 15:03:50.721729 4555 generic.go:334] "Generic (PLEG): container finished" podID="808bc873-7767-43d8-9ca3-7602f659dac6" containerID="1ac5bfcff583c2f49dfed44c6451e304daa5b1d4368b86dbcbc724d38faf0e23" exitCode=0 Dec 09 15:03:50 crc kubenswrapper[4555]: I1209 15:03:50.721785 4555 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-75fbfcbdd9-dh49r" Dec 09 15:03:50 crc kubenswrapper[4555]: I1209 15:03:50.721789 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-75fbfcbdd9-dh49r" event={"ID":"808bc873-7767-43d8-9ca3-7602f659dac6","Type":"ContainerDied","Data":"1ac5bfcff583c2f49dfed44c6451e304daa5b1d4368b86dbcbc724d38faf0e23"} Dec 09 15:03:50 crc kubenswrapper[4555]: I1209 15:03:50.721931 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-75fbfcbdd9-dh49r" event={"ID":"808bc873-7767-43d8-9ca3-7602f659dac6","Type":"ContainerDied","Data":"2df572ab1fb637266a053aa6564bdde838fca4f6b908943b583bc1d7ea73242f"} Dec 09 15:03:50 crc kubenswrapper[4555]: I1209 15:03:50.721959 4555 scope.go:117] "RemoveContainer" containerID="1ac5bfcff583c2f49dfed44c6451e304daa5b1d4368b86dbcbc724d38faf0e23" Dec 09 15:03:50 crc kubenswrapper[4555]: I1209 15:03:50.735576 4555 scope.go:117] "RemoveContainer" containerID="1ac5bfcff583c2f49dfed44c6451e304daa5b1d4368b86dbcbc724d38faf0e23" Dec 09 15:03:50 crc kubenswrapper[4555]: E1209 15:03:50.735900 4555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ac5bfcff583c2f49dfed44c6451e304daa5b1d4368b86dbcbc724d38faf0e23\": container with ID starting with 1ac5bfcff583c2f49dfed44c6451e304daa5b1d4368b86dbcbc724d38faf0e23 not found: ID does not exist" containerID="1ac5bfcff583c2f49dfed44c6451e304daa5b1d4368b86dbcbc724d38faf0e23" Dec 09 15:03:50 crc kubenswrapper[4555]: I1209 15:03:50.735927 4555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ac5bfcff583c2f49dfed44c6451e304daa5b1d4368b86dbcbc724d38faf0e23"} err="failed to get container status \"1ac5bfcff583c2f49dfed44c6451e304daa5b1d4368b86dbcbc724d38faf0e23\": rpc error: code = NotFound desc = could not find container \"1ac5bfcff583c2f49dfed44c6451e304daa5b1d4368b86dbcbc724d38faf0e23\": container with ID starting with 1ac5bfcff583c2f49dfed44c6451e304daa5b1d4368b86dbcbc724d38faf0e23 not found: ID does not exist" Dec 09 15:03:50 crc kubenswrapper[4555]: I1209 15:03:50.739905 4555 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75fbfcbdd9-dh49r"] Dec 09 15:03:50 crc kubenswrapper[4555]: I1209 15:03:50.742512 4555 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75fbfcbdd9-dh49r"] Dec 09 15:03:51 crc kubenswrapper[4555]: I1209 15:03:51.860604 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6445bb5c56-f9vkt"] Dec 09 15:03:51 crc kubenswrapper[4555]: E1209 15:03:51.861258 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="808bc873-7767-43d8-9ca3-7602f659dac6" containerName="route-controller-manager" Dec 09 15:03:51 crc kubenswrapper[4555]: I1209 15:03:51.861274 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="808bc873-7767-43d8-9ca3-7602f659dac6" containerName="route-controller-manager" Dec 09 15:03:51 crc kubenswrapper[4555]: I1209 15:03:51.861504 4555 memory_manager.go:354] "RemoveStaleState removing state" podUID="808bc873-7767-43d8-9ca3-7602f659dac6" containerName="route-controller-manager" Dec 09 15:03:51 crc kubenswrapper[4555]: I1209 15:03:51.862043 4555 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6445bb5c56-f9vkt" Dec 09 15:03:51 crc kubenswrapper[4555]: I1209 15:03:51.867260 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 09 15:03:51 crc kubenswrapper[4555]: I1209 15:03:51.867489 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 09 15:03:51 crc kubenswrapper[4555]: I1209 15:03:51.867833 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 09 15:03:51 crc kubenswrapper[4555]: I1209 15:03:51.869059 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 09 15:03:51 crc kubenswrapper[4555]: I1209 15:03:51.869157 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 09 15:03:51 crc kubenswrapper[4555]: I1209 15:03:51.869206 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 09 15:03:51 crc kubenswrapper[4555]: I1209 15:03:51.878268 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6445bb5c56-f9vkt"] Dec 09 15:03:51 crc kubenswrapper[4555]: I1209 15:03:51.921588 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17d162a6-c34c-44c2-866d-6c1e85cd32e9-serving-cert\") pod \"route-controller-manager-6445bb5c56-f9vkt\" (UID: \"17d162a6-c34c-44c2-866d-6c1e85cd32e9\") " pod="openshift-route-controller-manager/route-controller-manager-6445bb5c56-f9vkt" Dec 09 15:03:51 crc kubenswrapper[4555]: I1209 15:03:51.921736 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lprcg\" (UniqueName: \"kubernetes.io/projected/17d162a6-c34c-44c2-866d-6c1e85cd32e9-kube-api-access-lprcg\") pod \"route-controller-manager-6445bb5c56-f9vkt\" (UID: \"17d162a6-c34c-44c2-866d-6c1e85cd32e9\") " pod="openshift-route-controller-manager/route-controller-manager-6445bb5c56-f9vkt" Dec 09 15:03:51 crc kubenswrapper[4555]: I1209 15:03:51.921859 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/17d162a6-c34c-44c2-866d-6c1e85cd32e9-client-ca\") pod \"route-controller-manager-6445bb5c56-f9vkt\" (UID: \"17d162a6-c34c-44c2-866d-6c1e85cd32e9\") " pod="openshift-route-controller-manager/route-controller-manager-6445bb5c56-f9vkt" Dec 09 15:03:51 crc kubenswrapper[4555]: I1209 15:03:51.921921 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17d162a6-c34c-44c2-866d-6c1e85cd32e9-config\") pod \"route-controller-manager-6445bb5c56-f9vkt\" (UID: \"17d162a6-c34c-44c2-866d-6c1e85cd32e9\") " pod="openshift-route-controller-manager/route-controller-manager-6445bb5c56-f9vkt" Dec 09 15:03:52 crc kubenswrapper[4555]: I1209 15:03:52.022546 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/17d162a6-c34c-44c2-866d-6c1e85cd32e9-client-ca\") pod 
\"route-controller-manager-6445bb5c56-f9vkt\" (UID: \"17d162a6-c34c-44c2-866d-6c1e85cd32e9\") " pod="openshift-route-controller-manager/route-controller-manager-6445bb5c56-f9vkt" Dec 09 15:03:52 crc kubenswrapper[4555]: I1209 15:03:52.022588 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17d162a6-c34c-44c2-866d-6c1e85cd32e9-config\") pod \"route-controller-manager-6445bb5c56-f9vkt\" (UID: \"17d162a6-c34c-44c2-866d-6c1e85cd32e9\") " pod="openshift-route-controller-manager/route-controller-manager-6445bb5c56-f9vkt" Dec 09 15:03:52 crc kubenswrapper[4555]: I1209 15:03:52.022658 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17d162a6-c34c-44c2-866d-6c1e85cd32e9-serving-cert\") pod \"route-controller-manager-6445bb5c56-f9vkt\" (UID: \"17d162a6-c34c-44c2-866d-6c1e85cd32e9\") " pod="openshift-route-controller-manager/route-controller-manager-6445bb5c56-f9vkt" Dec 09 15:03:52 crc kubenswrapper[4555]: I1209 15:03:52.022702 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lprcg\" (UniqueName: \"kubernetes.io/projected/17d162a6-c34c-44c2-866d-6c1e85cd32e9-kube-api-access-lprcg\") pod \"route-controller-manager-6445bb5c56-f9vkt\" (UID: \"17d162a6-c34c-44c2-866d-6c1e85cd32e9\") " pod="openshift-route-controller-manager/route-controller-manager-6445bb5c56-f9vkt" Dec 09 15:03:52 crc kubenswrapper[4555]: I1209 15:03:52.023519 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/17d162a6-c34c-44c2-866d-6c1e85cd32e9-client-ca\") pod \"route-controller-manager-6445bb5c56-f9vkt\" (UID: \"17d162a6-c34c-44c2-866d-6c1e85cd32e9\") " pod="openshift-route-controller-manager/route-controller-manager-6445bb5c56-f9vkt" Dec 09 15:03:52 crc kubenswrapper[4555]: I1209 15:03:52.023698 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17d162a6-c34c-44c2-866d-6c1e85cd32e9-config\") pod \"route-controller-manager-6445bb5c56-f9vkt\" (UID: \"17d162a6-c34c-44c2-866d-6c1e85cd32e9\") " pod="openshift-route-controller-manager/route-controller-manager-6445bb5c56-f9vkt" Dec 09 15:03:52 crc kubenswrapper[4555]: I1209 15:03:52.026015 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17d162a6-c34c-44c2-866d-6c1e85cd32e9-serving-cert\") pod \"route-controller-manager-6445bb5c56-f9vkt\" (UID: \"17d162a6-c34c-44c2-866d-6c1e85cd32e9\") " pod="openshift-route-controller-manager/route-controller-manager-6445bb5c56-f9vkt" Dec 09 15:03:52 crc kubenswrapper[4555]: I1209 15:03:52.037869 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lprcg\" (UniqueName: \"kubernetes.io/projected/17d162a6-c34c-44c2-866d-6c1e85cd32e9-kube-api-access-lprcg\") pod \"route-controller-manager-6445bb5c56-f9vkt\" (UID: \"17d162a6-c34c-44c2-866d-6c1e85cd32e9\") " pod="openshift-route-controller-manager/route-controller-manager-6445bb5c56-f9vkt" Dec 09 15:03:52 crc kubenswrapper[4555]: I1209 15:03:52.165798 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="808bc873-7767-43d8-9ca3-7602f659dac6" path="/var/lib/kubelet/pods/808bc873-7767-43d8-9ca3-7602f659dac6/volumes" Dec 09 15:03:52 crc kubenswrapper[4555]: I1209 15:03:52.184676 4555 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6445bb5c56-f9vkt" Dec 09 15:03:52 crc kubenswrapper[4555]: I1209 15:03:52.539139 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6445bb5c56-f9vkt"] Dec 09 15:03:52 crc kubenswrapper[4555]: I1209 15:03:52.731171 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6445bb5c56-f9vkt" event={"ID":"17d162a6-c34c-44c2-866d-6c1e85cd32e9","Type":"ContainerStarted","Data":"d924e19e10d8134bc73927f8d9cf67bbb6171e873eac735e8ccd76a2f5e70f79"} Dec 09 15:03:52 crc kubenswrapper[4555]: I1209 15:03:52.731211 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6445bb5c56-f9vkt" event={"ID":"17d162a6-c34c-44c2-866d-6c1e85cd32e9","Type":"ContainerStarted","Data":"df82590ac5de8b72a8d3459271f3e84ece0d81215cdf7f2cf2d1c2d4e2a67847"} Dec 09 15:03:52 crc kubenswrapper[4555]: I1209 15:03:52.731411 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6445bb5c56-f9vkt" Dec 09 15:03:52 crc kubenswrapper[4555]: I1209 15:03:52.742837 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6445bb5c56-f9vkt" podStartSLOduration=3.74282571 podStartE2EDuration="3.74282571s" podCreationTimestamp="2025-12-09 15:03:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:03:52.742043253 +0000 UTC m=+342.878755971" watchObservedRunningTime="2025-12-09 15:03:52.74282571 +0000 UTC m=+342.879538428" Dec 09 15:03:52 crc kubenswrapper[4555]: I1209 15:03:52.892990 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6445bb5c56-f9vkt" Dec 09 15:03:55 crc kubenswrapper[4555]: I1209 15:03:55.532903 4555 patch_prober.go:28] interesting pod/machine-config-daemon-x6p6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 15:03:55 crc kubenswrapper[4555]: I1209 15:03:55.533699 4555 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 15:04:11 crc kubenswrapper[4555]: I1209 15:04:11.462540 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-xfw89"] Dec 09 15:04:11 crc kubenswrapper[4555]: I1209 15:04:11.463449 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-xfw89" Dec 09 15:04:11 crc kubenswrapper[4555]: I1209 15:04:11.473151 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-xfw89"] Dec 09 15:04:11 crc kubenswrapper[4555]: I1209 15:04:11.541991 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bd320c53-650c-41b8-86d6-34866e814aa2-bound-sa-token\") pod \"image-registry-66df7c8f76-xfw89\" (UID: \"bd320c53-650c-41b8-86d6-34866e814aa2\") " pod="openshift-image-registry/image-registry-66df7c8f76-xfw89" Dec 09 15:04:11 crc kubenswrapper[4555]: I1209 15:04:11.542046 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8w8hd\" (UniqueName: \"kubernetes.io/projected/bd320c53-650c-41b8-86d6-34866e814aa2-kube-api-access-8w8hd\") pod \"image-registry-66df7c8f76-xfw89\" (UID: \"bd320c53-650c-41b8-86d6-34866e814aa2\") " pod="openshift-image-registry/image-registry-66df7c8f76-xfw89" Dec 09 15:04:11 crc kubenswrapper[4555]: I1209 15:04:11.542073 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-xfw89\" (UID: \"bd320c53-650c-41b8-86d6-34866e814aa2\") " pod="openshift-image-registry/image-registry-66df7c8f76-xfw89" Dec 09 15:04:11 crc kubenswrapper[4555]: I1209 15:04:11.542095 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bd320c53-650c-41b8-86d6-34866e814aa2-installation-pull-secrets\") pod \"image-registry-66df7c8f76-xfw89\" (UID: \"bd320c53-650c-41b8-86d6-34866e814aa2\") " pod="openshift-image-registry/image-registry-66df7c8f76-xfw89" Dec 09 15:04:11 crc kubenswrapper[4555]: I1209 15:04:11.542129 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bd320c53-650c-41b8-86d6-34866e814aa2-registry-certificates\") pod \"image-registry-66df7c8f76-xfw89\" (UID: \"bd320c53-650c-41b8-86d6-34866e814aa2\") " pod="openshift-image-registry/image-registry-66df7c8f76-xfw89" Dec 09 15:04:11 crc kubenswrapper[4555]: I1209 15:04:11.542388 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bd320c53-650c-41b8-86d6-34866e814aa2-trusted-ca\") pod \"image-registry-66df7c8f76-xfw89\" (UID: \"bd320c53-650c-41b8-86d6-34866e814aa2\") " pod="openshift-image-registry/image-registry-66df7c8f76-xfw89" Dec 09 15:04:11 crc kubenswrapper[4555]: I1209 15:04:11.542448 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bd320c53-650c-41b8-86d6-34866e814aa2-registry-tls\") pod \"image-registry-66df7c8f76-xfw89\" (UID: \"bd320c53-650c-41b8-86d6-34866e814aa2\") " pod="openshift-image-registry/image-registry-66df7c8f76-xfw89" Dec 09 15:04:11 crc kubenswrapper[4555]: I1209 15:04:11.542467 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/bd320c53-650c-41b8-86d6-34866e814aa2-ca-trust-extracted\") pod \"image-registry-66df7c8f76-xfw89\" (UID: \"bd320c53-650c-41b8-86d6-34866e814aa2\") " pod="openshift-image-registry/image-registry-66df7c8f76-xfw89" Dec 09 15:04:11 crc kubenswrapper[4555]: I1209 15:04:11.562072 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-xfw89\" (UID: \"bd320c53-650c-41b8-86d6-34866e814aa2\") " pod="openshift-image-registry/image-registry-66df7c8f76-xfw89" Dec 09 15:04:11 crc kubenswrapper[4555]: I1209 15:04:11.643968 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bd320c53-650c-41b8-86d6-34866e814aa2-trusted-ca\") pod \"image-registry-66df7c8f76-xfw89\" (UID: \"bd320c53-650c-41b8-86d6-34866e814aa2\") " pod="openshift-image-registry/image-registry-66df7c8f76-xfw89" Dec 09 15:04:11 crc kubenswrapper[4555]: I1209 15:04:11.644009 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bd320c53-650c-41b8-86d6-34866e814aa2-ca-trust-extracted\") pod \"image-registry-66df7c8f76-xfw89\" (UID: \"bd320c53-650c-41b8-86d6-34866e814aa2\") " pod="openshift-image-registry/image-registry-66df7c8f76-xfw89" Dec 09 15:04:11 crc kubenswrapper[4555]: I1209 15:04:11.644029 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bd320c53-650c-41b8-86d6-34866e814aa2-registry-tls\") pod \"image-registry-66df7c8f76-xfw89\" (UID: \"bd320c53-650c-41b8-86d6-34866e814aa2\") " pod="openshift-image-registry/image-registry-66df7c8f76-xfw89" Dec 09 15:04:11 crc kubenswrapper[4555]: I1209 15:04:11.644062 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bd320c53-650c-41b8-86d6-34866e814aa2-bound-sa-token\") pod \"image-registry-66df7c8f76-xfw89\" (UID: \"bd320c53-650c-41b8-86d6-34866e814aa2\") " pod="openshift-image-registry/image-registry-66df7c8f76-xfw89" Dec 09 15:04:11 crc kubenswrapper[4555]: I1209 15:04:11.644083 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8w8hd\" (UniqueName: \"kubernetes.io/projected/bd320c53-650c-41b8-86d6-34866e814aa2-kube-api-access-8w8hd\") pod \"image-registry-66df7c8f76-xfw89\" (UID: \"bd320c53-650c-41b8-86d6-34866e814aa2\") " pod="openshift-image-registry/image-registry-66df7c8f76-xfw89" Dec 09 15:04:11 crc kubenswrapper[4555]: I1209 15:04:11.644102 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bd320c53-650c-41b8-86d6-34866e814aa2-installation-pull-secrets\") pod \"image-registry-66df7c8f76-xfw89\" (UID: \"bd320c53-650c-41b8-86d6-34866e814aa2\") " pod="openshift-image-registry/image-registry-66df7c8f76-xfw89" Dec 09 15:04:11 crc kubenswrapper[4555]: I1209 15:04:11.644130 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bd320c53-650c-41b8-86d6-34866e814aa2-registry-certificates\") pod \"image-registry-66df7c8f76-xfw89\" (UID: \"bd320c53-650c-41b8-86d6-34866e814aa2\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-xfw89" Dec 09 15:04:11 crc kubenswrapper[4555]: I1209 15:04:11.644469 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bd320c53-650c-41b8-86d6-34866e814aa2-ca-trust-extracted\") pod \"image-registry-66df7c8f76-xfw89\" (UID: \"bd320c53-650c-41b8-86d6-34866e814aa2\") " pod="openshift-image-registry/image-registry-66df7c8f76-xfw89" Dec 09 15:04:11 crc kubenswrapper[4555]: I1209 15:04:11.645145 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bd320c53-650c-41b8-86d6-34866e814aa2-registry-certificates\") pod \"image-registry-66df7c8f76-xfw89\" (UID: \"bd320c53-650c-41b8-86d6-34866e814aa2\") " pod="openshift-image-registry/image-registry-66df7c8f76-xfw89" Dec 09 15:04:11 crc kubenswrapper[4555]: I1209 15:04:11.645267 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bd320c53-650c-41b8-86d6-34866e814aa2-trusted-ca\") pod \"image-registry-66df7c8f76-xfw89\" (UID: \"bd320c53-650c-41b8-86d6-34866e814aa2\") " pod="openshift-image-registry/image-registry-66df7c8f76-xfw89" Dec 09 15:04:11 crc kubenswrapper[4555]: I1209 15:04:11.648320 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bd320c53-650c-41b8-86d6-34866e814aa2-registry-tls\") pod \"image-registry-66df7c8f76-xfw89\" (UID: \"bd320c53-650c-41b8-86d6-34866e814aa2\") " pod="openshift-image-registry/image-registry-66df7c8f76-xfw89" Dec 09 15:04:11 crc kubenswrapper[4555]: I1209 15:04:11.649023 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bd320c53-650c-41b8-86d6-34866e814aa2-installation-pull-secrets\") pod \"image-registry-66df7c8f76-xfw89\" (UID: \"bd320c53-650c-41b8-86d6-34866e814aa2\") " pod="openshift-image-registry/image-registry-66df7c8f76-xfw89" Dec 09 15:04:11 crc kubenswrapper[4555]: I1209 15:04:11.656422 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bd320c53-650c-41b8-86d6-34866e814aa2-bound-sa-token\") pod \"image-registry-66df7c8f76-xfw89\" (UID: \"bd320c53-650c-41b8-86d6-34866e814aa2\") " pod="openshift-image-registry/image-registry-66df7c8f76-xfw89" Dec 09 15:04:11 crc kubenswrapper[4555]: I1209 15:04:11.656997 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8w8hd\" (UniqueName: \"kubernetes.io/projected/bd320c53-650c-41b8-86d6-34866e814aa2-kube-api-access-8w8hd\") pod \"image-registry-66df7c8f76-xfw89\" (UID: \"bd320c53-650c-41b8-86d6-34866e814aa2\") " pod="openshift-image-registry/image-registry-66df7c8f76-xfw89" Dec 09 15:04:11 crc kubenswrapper[4555]: I1209 15:04:11.779069 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-xfw89" Dec 09 15:04:12 crc kubenswrapper[4555]: I1209 15:04:12.105674 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-xfw89"] Dec 09 15:04:12 crc kubenswrapper[4555]: W1209 15:04:12.109202 4555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd320c53_650c_41b8_86d6_34866e814aa2.slice/crio-aa9553a8dae6bee858821339f894cbee5b706a0eca5e4744b1161c4f09c4f534 WatchSource:0}: Error finding container aa9553a8dae6bee858821339f894cbee5b706a0eca5e4744b1161c4f09c4f534: Status 404 returned error can't find the container with id aa9553a8dae6bee858821339f894cbee5b706a0eca5e4744b1161c4f09c4f534 Dec 09 15:04:12 crc kubenswrapper[4555]: I1209 15:04:12.809107 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-xfw89" event={"ID":"bd320c53-650c-41b8-86d6-34866e814aa2","Type":"ContainerStarted","Data":"d7057a472df1862d67763f336266f5ca00419b45ae05084ca7c3073cd5cc0594"} Dec 09 15:04:12 crc kubenswrapper[4555]: I1209 15:04:12.809265 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-xfw89" event={"ID":"bd320c53-650c-41b8-86d6-34866e814aa2","Type":"ContainerStarted","Data":"aa9553a8dae6bee858821339f894cbee5b706a0eca5e4744b1161c4f09c4f534"} Dec 09 15:04:12 crc kubenswrapper[4555]: I1209 15:04:12.809280 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-xfw89" Dec 09 15:04:12 crc kubenswrapper[4555]: I1209 15:04:12.826214 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-xfw89" podStartSLOduration=1.826200606 podStartE2EDuration="1.826200606s" podCreationTimestamp="2025-12-09 15:04:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:04:12.8259849 +0000 UTC m=+362.962697668" watchObservedRunningTime="2025-12-09 15:04:12.826200606 +0000 UTC m=+362.962913324" Dec 09 15:04:13 crc kubenswrapper[4555]: I1209 15:04:13.564746 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/858a1ce9-f128-41e2-b6e0-3fabde0c8d33-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-tnf48\" (UID: \"858a1ce9-f128-41e2-b6e0-3fabde0c8d33\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-tnf48" Dec 09 15:04:13 crc kubenswrapper[4555]: I1209 15:04:13.569793 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/858a1ce9-f128-41e2-b6e0-3fabde0c8d33-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-tnf48\" (UID: \"858a1ce9-f128-41e2-b6e0-3fabde0c8d33\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-tnf48" Dec 09 15:04:13 crc kubenswrapper[4555]: I1209 15:04:13.810171 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-tnf48" Dec 09 15:04:14 crc kubenswrapper[4555]: I1209 15:04:14.142050 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-tnf48"] Dec 09 15:04:14 crc kubenswrapper[4555]: W1209 15:04:14.146094 4555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod858a1ce9_f128_41e2_b6e0_3fabde0c8d33.slice/crio-886c00653795a3206d046fa652b3f12a68a75c93290def98358f92c975d6b356 WatchSource:0}: Error finding container 886c00653795a3206d046fa652b3f12a68a75c93290def98358f92c975d6b356: Status 404 returned error can't find the container with id 886c00653795a3206d046fa652b3f12a68a75c93290def98358f92c975d6b356 Dec 09 15:04:14 crc kubenswrapper[4555]: I1209 15:04:14.817283 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-tnf48" event={"ID":"858a1ce9-f128-41e2-b6e0-3fabde0c8d33","Type":"ContainerStarted","Data":"886c00653795a3206d046fa652b3f12a68a75c93290def98358f92c975d6b356"} Dec 09 15:04:15 crc kubenswrapper[4555]: I1209 15:04:15.821713 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-tnf48" event={"ID":"858a1ce9-f128-41e2-b6e0-3fabde0c8d33","Type":"ContainerStarted","Data":"39e10ec3923943ea73732eb6ba00bb273ee68bae07e87fba58d6430fce4b1863"} Dec 09 15:04:15 crc kubenswrapper[4555]: I1209 15:04:15.822024 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-tnf48" Dec 09 15:04:15 crc kubenswrapper[4555]: I1209 15:04:15.825768 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-tnf48" Dec 09 15:04:15 crc kubenswrapper[4555]: I1209 15:04:15.831338 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-tnf48" podStartSLOduration=65.566003785 podStartE2EDuration="1m6.831323916s" podCreationTimestamp="2025-12-09 15:03:09 +0000 UTC" firstStartedPulling="2025-12-09 15:04:14.147915588 +0000 UTC m=+364.284628306" lastFinishedPulling="2025-12-09 15:04:15.413235719 +0000 UTC m=+365.549948437" observedRunningTime="2025-12-09 15:04:15.829829615 +0000 UTC m=+365.966542333" watchObservedRunningTime="2025-12-09 15:04:15.831323916 +0000 UTC m=+365.968036634" Dec 09 15:04:16 crc kubenswrapper[4555]: I1209 15:04:16.636199 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-db54df47d-rjhbr"] Dec 09 15:04:16 crc kubenswrapper[4555]: I1209 15:04:16.637162 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-db54df47d-rjhbr" Dec 09 15:04:16 crc kubenswrapper[4555]: I1209 15:04:16.639373 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls" Dec 09 15:04:16 crc kubenswrapper[4555]: I1209 15:04:16.639427 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config" Dec 09 15:04:16 crc kubenswrapper[4555]: I1209 15:04:16.640015 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca" Dec 09 15:04:16 crc kubenswrapper[4555]: I1209 15:04:16.641557 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-ncfks" Dec 09 15:04:16 crc kubenswrapper[4555]: I1209 15:04:16.648492 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-db54df47d-rjhbr"] Dec 09 15:04:16 crc kubenswrapper[4555]: I1209 15:04:16.696716 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/480c6849-4967-4c3a-921d-5e3f38a32bda-prometheus-operator-tls\") pod \"prometheus-operator-db54df47d-rjhbr\" (UID: \"480c6849-4967-4c3a-921d-5e3f38a32bda\") " pod="openshift-monitoring/prometheus-operator-db54df47d-rjhbr" Dec 09 15:04:16 crc kubenswrapper[4555]: I1209 15:04:16.696783 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wn9xl\" (UniqueName: \"kubernetes.io/projected/480c6849-4967-4c3a-921d-5e3f38a32bda-kube-api-access-wn9xl\") pod \"prometheus-operator-db54df47d-rjhbr\" (UID: \"480c6849-4967-4c3a-921d-5e3f38a32bda\") " pod="openshift-monitoring/prometheus-operator-db54df47d-rjhbr" Dec 09 15:04:16 crc kubenswrapper[4555]: I1209 15:04:16.696818 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/480c6849-4967-4c3a-921d-5e3f38a32bda-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-db54df47d-rjhbr\" (UID: \"480c6849-4967-4c3a-921d-5e3f38a32bda\") " pod="openshift-monitoring/prometheus-operator-db54df47d-rjhbr" Dec 09 15:04:16 crc kubenswrapper[4555]: I1209 15:04:16.697007 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/480c6849-4967-4c3a-921d-5e3f38a32bda-metrics-client-ca\") pod \"prometheus-operator-db54df47d-rjhbr\" (UID: \"480c6849-4967-4c3a-921d-5e3f38a32bda\") " pod="openshift-monitoring/prometheus-operator-db54df47d-rjhbr" Dec 09 15:04:16 crc kubenswrapper[4555]: I1209 15:04:16.798015 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/480c6849-4967-4c3a-921d-5e3f38a32bda-prometheus-operator-tls\") pod \"prometheus-operator-db54df47d-rjhbr\" (UID: \"480c6849-4967-4c3a-921d-5e3f38a32bda\") " pod="openshift-monitoring/prometheus-operator-db54df47d-rjhbr" Dec 09 15:04:16 crc kubenswrapper[4555]: I1209 15:04:16.798055 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wn9xl\" (UniqueName: \"kubernetes.io/projected/480c6849-4967-4c3a-921d-5e3f38a32bda-kube-api-access-wn9xl\") pod 
\"prometheus-operator-db54df47d-rjhbr\" (UID: \"480c6849-4967-4c3a-921d-5e3f38a32bda\") " pod="openshift-monitoring/prometheus-operator-db54df47d-rjhbr" Dec 09 15:04:16 crc kubenswrapper[4555]: I1209 15:04:16.798074 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/480c6849-4967-4c3a-921d-5e3f38a32bda-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-db54df47d-rjhbr\" (UID: \"480c6849-4967-4c3a-921d-5e3f38a32bda\") " pod="openshift-monitoring/prometheus-operator-db54df47d-rjhbr" Dec 09 15:04:16 crc kubenswrapper[4555]: E1209 15:04:16.798120 4555 secret.go:188] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Dec 09 15:04:16 crc kubenswrapper[4555]: E1209 15:04:16.798171 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/480c6849-4967-4c3a-921d-5e3f38a32bda-prometheus-operator-tls podName:480c6849-4967-4c3a-921d-5e3f38a32bda nodeName:}" failed. No retries permitted until 2025-12-09 15:04:17.298156296 +0000 UTC m=+367.434869014 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/480c6849-4967-4c3a-921d-5e3f38a32bda-prometheus-operator-tls") pod "prometheus-operator-db54df47d-rjhbr" (UID: "480c6849-4967-4c3a-921d-5e3f38a32bda") : secret "prometheus-operator-tls" not found Dec 09 15:04:16 crc kubenswrapper[4555]: I1209 15:04:16.798194 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/480c6849-4967-4c3a-921d-5e3f38a32bda-metrics-client-ca\") pod \"prometheus-operator-db54df47d-rjhbr\" (UID: \"480c6849-4967-4c3a-921d-5e3f38a32bda\") " pod="openshift-monitoring/prometheus-operator-db54df47d-rjhbr" Dec 09 15:04:16 crc kubenswrapper[4555]: I1209 15:04:16.799087 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/480c6849-4967-4c3a-921d-5e3f38a32bda-metrics-client-ca\") pod \"prometheus-operator-db54df47d-rjhbr\" (UID: \"480c6849-4967-4c3a-921d-5e3f38a32bda\") " pod="openshift-monitoring/prometheus-operator-db54df47d-rjhbr" Dec 09 15:04:16 crc kubenswrapper[4555]: I1209 15:04:16.803506 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/480c6849-4967-4c3a-921d-5e3f38a32bda-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-db54df47d-rjhbr\" (UID: \"480c6849-4967-4c3a-921d-5e3f38a32bda\") " pod="openshift-monitoring/prometheus-operator-db54df47d-rjhbr" Dec 09 15:04:16 crc kubenswrapper[4555]: I1209 15:04:16.812529 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wn9xl\" (UniqueName: \"kubernetes.io/projected/480c6849-4967-4c3a-921d-5e3f38a32bda-kube-api-access-wn9xl\") pod \"prometheus-operator-db54df47d-rjhbr\" (UID: \"480c6849-4967-4c3a-921d-5e3f38a32bda\") " pod="openshift-monitoring/prometheus-operator-db54df47d-rjhbr" Dec 09 15:04:17 crc kubenswrapper[4555]: I1209 15:04:17.304322 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/480c6849-4967-4c3a-921d-5e3f38a32bda-prometheus-operator-tls\") pod \"prometheus-operator-db54df47d-rjhbr\" (UID: 
\"480c6849-4967-4c3a-921d-5e3f38a32bda\") " pod="openshift-monitoring/prometheus-operator-db54df47d-rjhbr" Dec 09 15:04:17 crc kubenswrapper[4555]: I1209 15:04:17.307407 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/480c6849-4967-4c3a-921d-5e3f38a32bda-prometheus-operator-tls\") pod \"prometheus-operator-db54df47d-rjhbr\" (UID: \"480c6849-4967-4c3a-921d-5e3f38a32bda\") " pod="openshift-monitoring/prometheus-operator-db54df47d-rjhbr" Dec 09 15:04:17 crc kubenswrapper[4555]: I1209 15:04:17.548388 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-db54df47d-rjhbr" Dec 09 15:04:17 crc kubenswrapper[4555]: I1209 15:04:17.878662 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-db54df47d-rjhbr"] Dec 09 15:04:17 crc kubenswrapper[4555]: W1209 15:04:17.882125 4555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod480c6849_4967_4c3a_921d_5e3f38a32bda.slice/crio-3966e1b0593165337e3495197c40e36a34fdc516bfab8539c25a1552ef988c0f WatchSource:0}: Error finding container 3966e1b0593165337e3495197c40e36a34fdc516bfab8539c25a1552ef988c0f: Status 404 returned error can't find the container with id 3966e1b0593165337e3495197c40e36a34fdc516bfab8539c25a1552ef988c0f Dec 09 15:04:18 crc kubenswrapper[4555]: I1209 15:04:18.832995 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-db54df47d-rjhbr" event={"ID":"480c6849-4967-4c3a-921d-5e3f38a32bda","Type":"ContainerStarted","Data":"3966e1b0593165337e3495197c40e36a34fdc516bfab8539c25a1552ef988c0f"} Dec 09 15:04:19 crc kubenswrapper[4555]: I1209 15:04:19.838483 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-db54df47d-rjhbr" event={"ID":"480c6849-4967-4c3a-921d-5e3f38a32bda","Type":"ContainerStarted","Data":"2c68e6e1b8baa3a6cdd358063b85f423a810677b94d51d99e4d67a2d207ccc63"} Dec 09 15:04:19 crc kubenswrapper[4555]: I1209 15:04:19.838722 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-db54df47d-rjhbr" event={"ID":"480c6849-4967-4c3a-921d-5e3f38a32bda","Type":"ContainerStarted","Data":"fc028d82953883fec54175d6135539bb37b3b04c4e3db031e15eb1abbe8fb574"} Dec 09 15:04:19 crc kubenswrapper[4555]: I1209 15:04:19.852401 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-db54df47d-rjhbr" podStartSLOduration=2.445438748 podStartE2EDuration="3.852372271s" podCreationTimestamp="2025-12-09 15:04:16 +0000 UTC" firstStartedPulling="2025-12-09 15:04:17.884291841 +0000 UTC m=+368.021004559" lastFinishedPulling="2025-12-09 15:04:19.291225364 +0000 UTC m=+369.427938082" observedRunningTime="2025-12-09 15:04:19.848391486 +0000 UTC m=+369.985104204" watchObservedRunningTime="2025-12-09 15:04:19.852372271 +0000 UTC m=+369.989084989" Dec 09 15:04:21 crc kubenswrapper[4555]: I1209 15:04:21.983861 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-v6jwh"] Dec 09 15:04:21 crc kubenswrapper[4555]: I1209 15:04:21.984716 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-v6jwh" Dec 09 15:04:21 crc kubenswrapper[4555]: W1209 15:04:21.989485 4555 reflector.go:561] object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config": failed to list *v1.Secret: secrets "node-exporter-kube-rbac-proxy-config" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-monitoring": no relationship found between node 'crc' and this object Dec 09 15:04:21 crc kubenswrapper[4555]: E1209 15:04:21.989696 4555 reflector.go:158] "Unhandled Error" err="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"node-exporter-kube-rbac-proxy-config\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-monitoring\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 09 15:04:21 crc kubenswrapper[4555]: W1209 15:04:21.989505 4555 reflector.go:561] object-"openshift-monitoring"/"node-exporter-dockercfg-txn95": failed to list *v1.Secret: secrets "node-exporter-dockercfg-txn95" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-monitoring": no relationship found between node 'crc' and this object Dec 09 15:04:21 crc kubenswrapper[4555]: E1209 15:04:21.989728 4555 reflector.go:158] "Unhandled Error" err="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-txn95\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"node-exporter-dockercfg-txn95\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-monitoring\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 09 15:04:21 crc kubenswrapper[4555]: W1209 15:04:21.989527 4555 reflector.go:561] object-"openshift-monitoring"/"node-exporter-tls": failed to list *v1.Secret: secrets "node-exporter-tls" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-monitoring": no relationship found between node 'crc' and this object Dec 09 15:04:21 crc kubenswrapper[4555]: E1209 15:04:21.989745 4555 reflector.go:158] "Unhandled Error" err="object-\"openshift-monitoring\"/\"node-exporter-tls\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"node-exporter-tls\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-monitoring\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 09 15:04:22 crc kubenswrapper[4555]: I1209 15:04:22.002406 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-566fddb674-f6g6m"] Dec 09 15:04:22 crc kubenswrapper[4555]: I1209 15:04:22.003282 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-566fddb674-f6g6m" Dec 09 15:04:22 crc kubenswrapper[4555]: I1209 15:04:22.007818 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-tls" Dec 09 15:04:22 crc kubenswrapper[4555]: I1209 15:04:22.007912 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config" Dec 09 15:04:22 crc kubenswrapper[4555]: I1209 15:04:22.008019 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-dockercfg-4d99s" Dec 09 15:04:22 crc kubenswrapper[4555]: I1209 15:04:22.013717 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-566fddb674-f6g6m"] Dec 09 15:04:22 crc kubenswrapper[4555]: I1209 15:04:22.033194 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-777cb5bd5d-hw7z2"] Dec 09 15:04:22 crc kubenswrapper[4555]: I1209 15:04:22.034801 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-hw7z2" Dec 09 15:04:22 crc kubenswrapper[4555]: I1209 15:04:22.037695 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config" Dec 09 15:04:22 crc kubenswrapper[4555]: I1209 15:04:22.037880 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls" Dec 09 15:04:22 crc kubenswrapper[4555]: I1209 15:04:22.038159 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap" Dec 09 15:04:22 crc kubenswrapper[4555]: I1209 15:04:22.042811 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-dockercfg-fzf5r" Dec 09 15:04:22 crc kubenswrapper[4555]: I1209 15:04:22.053128 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-777cb5bd5d-hw7z2"] Dec 09 15:04:22 crc kubenswrapper[4555]: I1209 15:04:22.056697 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/aa2c82dc-2f5f-4749-a8b0-56f2fcc8f964-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-566fddb674-f6g6m\" (UID: \"aa2c82dc-2f5f-4749-a8b0-56f2fcc8f964\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-f6g6m" Dec 09 15:04:22 crc kubenswrapper[4555]: I1209 15:04:22.056747 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/aa2c82dc-2f5f-4749-a8b0-56f2fcc8f964-metrics-client-ca\") pod \"openshift-state-metrics-566fddb674-f6g6m\" (UID: \"aa2c82dc-2f5f-4749-a8b0-56f2fcc8f964\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-f6g6m" Dec 09 15:04:22 crc kubenswrapper[4555]: I1209 15:04:22.056787 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/95215982-c5fe-4f89-87f4-b0148f496f7c-node-exporter-wtmp\") pod \"node-exporter-v6jwh\" (UID: \"95215982-c5fe-4f89-87f4-b0148f496f7c\") " 
pod="openshift-monitoring/node-exporter-v6jwh" Dec 09 15:04:22 crc kubenswrapper[4555]: I1209 15:04:22.056808 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffx4b\" (UniqueName: \"kubernetes.io/projected/95215982-c5fe-4f89-87f4-b0148f496f7c-kube-api-access-ffx4b\") pod \"node-exporter-v6jwh\" (UID: \"95215982-c5fe-4f89-87f4-b0148f496f7c\") " pod="openshift-monitoring/node-exporter-v6jwh" Dec 09 15:04:22 crc kubenswrapper[4555]: I1209 15:04:22.056833 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/95215982-c5fe-4f89-87f4-b0148f496f7c-root\") pod \"node-exporter-v6jwh\" (UID: \"95215982-c5fe-4f89-87f4-b0148f496f7c\") " pod="openshift-monitoring/node-exporter-v6jwh" Dec 09 15:04:22 crc kubenswrapper[4555]: I1209 15:04:22.056854 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/95215982-c5fe-4f89-87f4-b0148f496f7c-sys\") pod \"node-exporter-v6jwh\" (UID: \"95215982-c5fe-4f89-87f4-b0148f496f7c\") " pod="openshift-monitoring/node-exporter-v6jwh" Dec 09 15:04:22 crc kubenswrapper[4555]: I1209 15:04:22.056872 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/95215982-c5fe-4f89-87f4-b0148f496f7c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-v6jwh\" (UID: \"95215982-c5fe-4f89-87f4-b0148f496f7c\") " pod="openshift-monitoring/node-exporter-v6jwh" Dec 09 15:04:22 crc kubenswrapper[4555]: I1209 15:04:22.056890 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/95215982-c5fe-4f89-87f4-b0148f496f7c-metrics-client-ca\") pod \"node-exporter-v6jwh\" (UID: \"95215982-c5fe-4f89-87f4-b0148f496f7c\") " pod="openshift-monitoring/node-exporter-v6jwh" Dec 09 15:04:22 crc kubenswrapper[4555]: I1209 15:04:22.056905 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/95215982-c5fe-4f89-87f4-b0148f496f7c-node-exporter-tls\") pod \"node-exporter-v6jwh\" (UID: \"95215982-c5fe-4f89-87f4-b0148f496f7c\") " pod="openshift-monitoring/node-exporter-v6jwh" Dec 09 15:04:22 crc kubenswrapper[4555]: I1209 15:04:22.056928 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/aa2c82dc-2f5f-4749-a8b0-56f2fcc8f964-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-f6g6m\" (UID: \"aa2c82dc-2f5f-4749-a8b0-56f2fcc8f964\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-f6g6m" Dec 09 15:04:22 crc kubenswrapper[4555]: I1209 15:04:22.056942 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rbkz\" (UniqueName: \"kubernetes.io/projected/aa2c82dc-2f5f-4749-a8b0-56f2fcc8f964-kube-api-access-8rbkz\") pod \"openshift-state-metrics-566fddb674-f6g6m\" (UID: \"aa2c82dc-2f5f-4749-a8b0-56f2fcc8f964\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-f6g6m" Dec 09 15:04:22 crc kubenswrapper[4555]: I1209 15:04:22.056978 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/95215982-c5fe-4f89-87f4-b0148f496f7c-node-exporter-textfile\") pod \"node-exporter-v6jwh\" (UID: \"95215982-c5fe-4f89-87f4-b0148f496f7c\") " pod="openshift-monitoring/node-exporter-v6jwh" Dec 09 15:04:22 crc kubenswrapper[4555]: I1209 15:04:22.157825 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/95215982-c5fe-4f89-87f4-b0148f496f7c-sys\") pod \"node-exporter-v6jwh\" (UID: \"95215982-c5fe-4f89-87f4-b0148f496f7c\") " pod="openshift-monitoring/node-exporter-v6jwh" Dec 09 15:04:22 crc kubenswrapper[4555]: I1209 15:04:22.157865 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/95215982-c5fe-4f89-87f4-b0148f496f7c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-v6jwh\" (UID: \"95215982-c5fe-4f89-87f4-b0148f496f7c\") " pod="openshift-monitoring/node-exporter-v6jwh" Dec 09 15:04:22 crc kubenswrapper[4555]: I1209 15:04:22.157889 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/95215982-c5fe-4f89-87f4-b0148f496f7c-metrics-client-ca\") pod \"node-exporter-v6jwh\" (UID: \"95215982-c5fe-4f89-87f4-b0148f496f7c\") " pod="openshift-monitoring/node-exporter-v6jwh" Dec 09 15:04:22 crc kubenswrapper[4555]: I1209 15:04:22.157908 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/95215982-c5fe-4f89-87f4-b0148f496f7c-node-exporter-tls\") pod \"node-exporter-v6jwh\" (UID: \"95215982-c5fe-4f89-87f4-b0148f496f7c\") " pod="openshift-monitoring/node-exporter-v6jwh" Dec 09 15:04:22 crc kubenswrapper[4555]: I1209 15:04:22.157930 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rbkz\" (UniqueName: \"kubernetes.io/projected/aa2c82dc-2f5f-4749-a8b0-56f2fcc8f964-kube-api-access-8rbkz\") pod \"openshift-state-metrics-566fddb674-f6g6m\" (UID: \"aa2c82dc-2f5f-4749-a8b0-56f2fcc8f964\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-f6g6m" Dec 09 15:04:22 crc kubenswrapper[4555]: I1209 15:04:22.157938 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/95215982-c5fe-4f89-87f4-b0148f496f7c-sys\") pod \"node-exporter-v6jwh\" (UID: \"95215982-c5fe-4f89-87f4-b0148f496f7c\") " pod="openshift-monitoring/node-exporter-v6jwh" Dec 09 15:04:22 crc kubenswrapper[4555]: I1209 15:04:22.157945 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/aa2c82dc-2f5f-4749-a8b0-56f2fcc8f964-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-f6g6m\" (UID: \"aa2c82dc-2f5f-4749-a8b0-56f2fcc8f964\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-f6g6m" Dec 09 15:04:22 crc kubenswrapper[4555]: I1209 15:04:22.157986 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7572ab14-4b5b-4f9d-91df-9e2efa7cc560-metrics-client-ca\") pod \"kube-state-metrics-777cb5bd5d-hw7z2\" (UID: \"7572ab14-4b5b-4f9d-91df-9e2efa7cc560\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-hw7z2" Dec 09 15:04:22 crc kubenswrapper[4555]: I1209 
15:04:22.158019 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/95215982-c5fe-4f89-87f4-b0148f496f7c-node-exporter-textfile\") pod \"node-exporter-v6jwh\" (UID: \"95215982-c5fe-4f89-87f4-b0148f496f7c\") " pod="openshift-monitoring/node-exporter-v6jwh" Dec 09 15:04:22 crc kubenswrapper[4555]: I1209 15:04:22.158038 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7572ab14-4b5b-4f9d-91df-9e2efa7cc560-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-777cb5bd5d-hw7z2\" (UID: \"7572ab14-4b5b-4f9d-91df-9e2efa7cc560\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-hw7z2" Dec 09 15:04:22 crc kubenswrapper[4555]: I1209 15:04:22.158068 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/7572ab14-4b5b-4f9d-91df-9e2efa7cc560-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-777cb5bd5d-hw7z2\" (UID: \"7572ab14-4b5b-4f9d-91df-9e2efa7cc560\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-hw7z2" Dec 09 15:04:22 crc kubenswrapper[4555]: I1209 15:04:22.158092 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/aa2c82dc-2f5f-4749-a8b0-56f2fcc8f964-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-566fddb674-f6g6m\" (UID: \"aa2c82dc-2f5f-4749-a8b0-56f2fcc8f964\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-f6g6m" Dec 09 15:04:22 crc kubenswrapper[4555]: I1209 15:04:22.158108 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrdgc\" (UniqueName: \"kubernetes.io/projected/7572ab14-4b5b-4f9d-91df-9e2efa7cc560-kube-api-access-hrdgc\") pod \"kube-state-metrics-777cb5bd5d-hw7z2\" (UID: \"7572ab14-4b5b-4f9d-91df-9e2efa7cc560\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-hw7z2" Dec 09 15:04:22 crc kubenswrapper[4555]: I1209 15:04:22.158126 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/7572ab14-4b5b-4f9d-91df-9e2efa7cc560-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-hw7z2\" (UID: \"7572ab14-4b5b-4f9d-91df-9e2efa7cc560\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-hw7z2" Dec 09 15:04:22 crc kubenswrapper[4555]: I1209 15:04:22.158149 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/aa2c82dc-2f5f-4749-a8b0-56f2fcc8f964-metrics-client-ca\") pod \"openshift-state-metrics-566fddb674-f6g6m\" (UID: \"aa2c82dc-2f5f-4749-a8b0-56f2fcc8f964\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-f6g6m" Dec 09 15:04:22 crc kubenswrapper[4555]: I1209 15:04:22.158168 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/7572ab14-4b5b-4f9d-91df-9e2efa7cc560-volume-directive-shadow\") pod \"kube-state-metrics-777cb5bd5d-hw7z2\" (UID: \"7572ab14-4b5b-4f9d-91df-9e2efa7cc560\") " 
pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-hw7z2" Dec 09 15:04:22 crc kubenswrapper[4555]: I1209 15:04:22.158183 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/95215982-c5fe-4f89-87f4-b0148f496f7c-node-exporter-wtmp\") pod \"node-exporter-v6jwh\" (UID: \"95215982-c5fe-4f89-87f4-b0148f496f7c\") " pod="openshift-monitoring/node-exporter-v6jwh" Dec 09 15:04:22 crc kubenswrapper[4555]: I1209 15:04:22.158203 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffx4b\" (UniqueName: \"kubernetes.io/projected/95215982-c5fe-4f89-87f4-b0148f496f7c-kube-api-access-ffx4b\") pod \"node-exporter-v6jwh\" (UID: \"95215982-c5fe-4f89-87f4-b0148f496f7c\") " pod="openshift-monitoring/node-exporter-v6jwh" Dec 09 15:04:22 crc kubenswrapper[4555]: I1209 15:04:22.158224 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/95215982-c5fe-4f89-87f4-b0148f496f7c-root\") pod \"node-exporter-v6jwh\" (UID: \"95215982-c5fe-4f89-87f4-b0148f496f7c\") " pod="openshift-monitoring/node-exporter-v6jwh" Dec 09 15:04:22 crc kubenswrapper[4555]: I1209 15:04:22.158276 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/95215982-c5fe-4f89-87f4-b0148f496f7c-root\") pod \"node-exporter-v6jwh\" (UID: \"95215982-c5fe-4f89-87f4-b0148f496f7c\") " pod="openshift-monitoring/node-exporter-v6jwh" Dec 09 15:04:22 crc kubenswrapper[4555]: I1209 15:04:22.158404 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/95215982-c5fe-4f89-87f4-b0148f496f7c-node-exporter-wtmp\") pod \"node-exporter-v6jwh\" (UID: \"95215982-c5fe-4f89-87f4-b0148f496f7c\") " pod="openshift-monitoring/node-exporter-v6jwh" Dec 09 15:04:22 crc kubenswrapper[4555]: I1209 15:04:22.158405 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/95215982-c5fe-4f89-87f4-b0148f496f7c-node-exporter-textfile\") pod \"node-exporter-v6jwh\" (UID: \"95215982-c5fe-4f89-87f4-b0148f496f7c\") " pod="openshift-monitoring/node-exporter-v6jwh" Dec 09 15:04:22 crc kubenswrapper[4555]: I1209 15:04:22.158647 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/95215982-c5fe-4f89-87f4-b0148f496f7c-metrics-client-ca\") pod \"node-exporter-v6jwh\" (UID: \"95215982-c5fe-4f89-87f4-b0148f496f7c\") " pod="openshift-monitoring/node-exporter-v6jwh" Dec 09 15:04:22 crc kubenswrapper[4555]: I1209 15:04:22.159015 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/aa2c82dc-2f5f-4749-a8b0-56f2fcc8f964-metrics-client-ca\") pod \"openshift-state-metrics-566fddb674-f6g6m\" (UID: \"aa2c82dc-2f5f-4749-a8b0-56f2fcc8f964\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-f6g6m" Dec 09 15:04:22 crc kubenswrapper[4555]: I1209 15:04:22.164110 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/aa2c82dc-2f5f-4749-a8b0-56f2fcc8f964-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-566fddb674-f6g6m\" (UID: \"aa2c82dc-2f5f-4749-a8b0-56f2fcc8f964\") " 
pod="openshift-monitoring/openshift-state-metrics-566fddb674-f6g6m" Dec 09 15:04:22 crc kubenswrapper[4555]: I1209 15:04:22.173246 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rbkz\" (UniqueName: \"kubernetes.io/projected/aa2c82dc-2f5f-4749-a8b0-56f2fcc8f964-kube-api-access-8rbkz\") pod \"openshift-state-metrics-566fddb674-f6g6m\" (UID: \"aa2c82dc-2f5f-4749-a8b0-56f2fcc8f964\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-f6g6m" Dec 09 15:04:22 crc kubenswrapper[4555]: I1209 15:04:22.174117 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffx4b\" (UniqueName: \"kubernetes.io/projected/95215982-c5fe-4f89-87f4-b0148f496f7c-kube-api-access-ffx4b\") pod \"node-exporter-v6jwh\" (UID: \"95215982-c5fe-4f89-87f4-b0148f496f7c\") " pod="openshift-monitoring/node-exporter-v6jwh" Dec 09 15:04:22 crc kubenswrapper[4555]: I1209 15:04:22.174866 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/aa2c82dc-2f5f-4749-a8b0-56f2fcc8f964-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-f6g6m\" (UID: \"aa2c82dc-2f5f-4749-a8b0-56f2fcc8f964\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-f6g6m" Dec 09 15:04:22 crc kubenswrapper[4555]: I1209 15:04:22.259445 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrdgc\" (UniqueName: \"kubernetes.io/projected/7572ab14-4b5b-4f9d-91df-9e2efa7cc560-kube-api-access-hrdgc\") pod \"kube-state-metrics-777cb5bd5d-hw7z2\" (UID: \"7572ab14-4b5b-4f9d-91df-9e2efa7cc560\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-hw7z2" Dec 09 15:04:22 crc kubenswrapper[4555]: I1209 15:04:22.259495 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/7572ab14-4b5b-4f9d-91df-9e2efa7cc560-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-hw7z2\" (UID: \"7572ab14-4b5b-4f9d-91df-9e2efa7cc560\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-hw7z2" Dec 09 15:04:22 crc kubenswrapper[4555]: I1209 15:04:22.259535 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/7572ab14-4b5b-4f9d-91df-9e2efa7cc560-volume-directive-shadow\") pod \"kube-state-metrics-777cb5bd5d-hw7z2\" (UID: \"7572ab14-4b5b-4f9d-91df-9e2efa7cc560\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-hw7z2" Dec 09 15:04:22 crc kubenswrapper[4555]: I1209 15:04:22.259639 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7572ab14-4b5b-4f9d-91df-9e2efa7cc560-metrics-client-ca\") pod \"kube-state-metrics-777cb5bd5d-hw7z2\" (UID: \"7572ab14-4b5b-4f9d-91df-9e2efa7cc560\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-hw7z2" Dec 09 15:04:22 crc kubenswrapper[4555]: I1209 15:04:22.259682 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7572ab14-4b5b-4f9d-91df-9e2efa7cc560-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-777cb5bd5d-hw7z2\" (UID: \"7572ab14-4b5b-4f9d-91df-9e2efa7cc560\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-hw7z2" Dec 09 15:04:22 crc kubenswrapper[4555]: I1209 
15:04:22.259711 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/7572ab14-4b5b-4f9d-91df-9e2efa7cc560-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-777cb5bd5d-hw7z2\" (UID: \"7572ab14-4b5b-4f9d-91df-9e2efa7cc560\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-hw7z2" Dec 09 15:04:22 crc kubenswrapper[4555]: I1209 15:04:22.260001 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/7572ab14-4b5b-4f9d-91df-9e2efa7cc560-volume-directive-shadow\") pod \"kube-state-metrics-777cb5bd5d-hw7z2\" (UID: \"7572ab14-4b5b-4f9d-91df-9e2efa7cc560\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-hw7z2" Dec 09 15:04:22 crc kubenswrapper[4555]: I1209 15:04:22.260455 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7572ab14-4b5b-4f9d-91df-9e2efa7cc560-metrics-client-ca\") pod \"kube-state-metrics-777cb5bd5d-hw7z2\" (UID: \"7572ab14-4b5b-4f9d-91df-9e2efa7cc560\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-hw7z2" Dec 09 15:04:22 crc kubenswrapper[4555]: I1209 15:04:22.260494 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/7572ab14-4b5b-4f9d-91df-9e2efa7cc560-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-777cb5bd5d-hw7z2\" (UID: \"7572ab14-4b5b-4f9d-91df-9e2efa7cc560\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-hw7z2" Dec 09 15:04:22 crc kubenswrapper[4555]: I1209 15:04:22.262325 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7572ab14-4b5b-4f9d-91df-9e2efa7cc560-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-777cb5bd5d-hw7z2\" (UID: \"7572ab14-4b5b-4f9d-91df-9e2efa7cc560\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-hw7z2" Dec 09 15:04:22 crc kubenswrapper[4555]: I1209 15:04:22.262715 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/7572ab14-4b5b-4f9d-91df-9e2efa7cc560-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-hw7z2\" (UID: \"7572ab14-4b5b-4f9d-91df-9e2efa7cc560\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-hw7z2" Dec 09 15:04:22 crc kubenswrapper[4555]: I1209 15:04:22.276352 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrdgc\" (UniqueName: \"kubernetes.io/projected/7572ab14-4b5b-4f9d-91df-9e2efa7cc560-kube-api-access-hrdgc\") pod \"kube-state-metrics-777cb5bd5d-hw7z2\" (UID: \"7572ab14-4b5b-4f9d-91df-9e2efa7cc560\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-hw7z2" Dec 09 15:04:22 crc kubenswrapper[4555]: I1209 15:04:22.312872 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-566fddb674-f6g6m" Dec 09 15:04:22 crc kubenswrapper[4555]: I1209 15:04:22.349055 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-hw7z2" Dec 09 15:04:22 crc kubenswrapper[4555]: I1209 15:04:22.661022 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-566fddb674-f6g6m"] Dec 09 15:04:22 crc kubenswrapper[4555]: W1209 15:04:22.664352 4555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa2c82dc_2f5f_4749_a8b0_56f2fcc8f964.slice/crio-eb003daead4b2b5986aab9b5330b585f0d92dbafd3c2e5beb5b27c0a7b063ee8 WatchSource:0}: Error finding container eb003daead4b2b5986aab9b5330b585f0d92dbafd3c2e5beb5b27c0a7b063ee8: Status 404 returned error can't find the container with id eb003daead4b2b5986aab9b5330b585f0d92dbafd3c2e5beb5b27c0a7b063ee8 Dec 09 15:04:22 crc kubenswrapper[4555]: I1209 15:04:22.706832 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-777cb5bd5d-hw7z2"] Dec 09 15:04:22 crc kubenswrapper[4555]: I1209 15:04:22.854007 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-f6g6m" event={"ID":"aa2c82dc-2f5f-4749-a8b0-56f2fcc8f964","Type":"ContainerStarted","Data":"e2e1111a4a9a771a51da2eaf982cd9cda98fd5d4e325ac53f6c54514d6a6d072"} Dec 09 15:04:22 crc kubenswrapper[4555]: I1209 15:04:22.854427 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-f6g6m" event={"ID":"aa2c82dc-2f5f-4749-a8b0-56f2fcc8f964","Type":"ContainerStarted","Data":"09d6d70273205d70278b88379efcabdc258e1a6bc59631dd05ec49376106ef91"} Dec 09 15:04:22 crc kubenswrapper[4555]: I1209 15:04:22.854570 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-f6g6m" event={"ID":"aa2c82dc-2f5f-4749-a8b0-56f2fcc8f964","Type":"ContainerStarted","Data":"eb003daead4b2b5986aab9b5330b585f0d92dbafd3c2e5beb5b27c0a7b063ee8"} Dec 09 15:04:22 crc kubenswrapper[4555]: I1209 15:04:22.855044 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-hw7z2" event={"ID":"7572ab14-4b5b-4f9d-91df-9e2efa7cc560","Type":"ContainerStarted","Data":"c5e5a738204cffe67031a9445464e38e4ec0f55985648165c349a3640ea63f97"} Dec 09 15:04:23 crc kubenswrapper[4555]: I1209 15:04:23.002119 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls" Dec 09 15:04:23 crc kubenswrapper[4555]: I1209 15:04:23.013373 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/95215982-c5fe-4f89-87f4-b0148f496f7c-node-exporter-tls\") pod \"node-exporter-v6jwh\" (UID: \"95215982-c5fe-4f89-87f4-b0148f496f7c\") " pod="openshift-monitoring/node-exporter-v6jwh" Dec 09 15:04:23 crc kubenswrapper[4555]: I1209 15:04:23.014519 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Dec 09 15:04:23 crc kubenswrapper[4555]: I1209 15:04:23.015972 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Dec 09 15:04:23 crc kubenswrapper[4555]: I1209 15:04:23.017887 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls" Dec 09 15:04:23 crc kubenswrapper[4555]: I1209 15:04:23.017913 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-generated" Dec 09 15:04:23 crc kubenswrapper[4555]: I1209 15:04:23.018049 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-web-config" Dec 09 15:04:23 crc kubenswrapper[4555]: I1209 15:04:23.019745 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-web" Dec 09 15:04:23 crc kubenswrapper[4555]: I1209 15:04:23.020245 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-dockercfg-5whcm" Dec 09 15:04:23 crc kubenswrapper[4555]: I1209 15:04:23.020856 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy" Dec 09 15:04:23 crc kubenswrapper[4555]: I1209 15:04:23.021346 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-metric" Dec 09 15:04:23 crc kubenswrapper[4555]: I1209 15:04:23.021933 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls-assets-0" Dec 09 15:04:23 crc kubenswrapper[4555]: I1209 15:04:23.026809 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"alertmanager-trusted-ca-bundle" Dec 09 15:04:23 crc kubenswrapper[4555]: I1209 15:04:23.030390 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Dec 09 15:04:23 crc kubenswrapper[4555]: I1209 15:04:23.066914 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3c5d1674-62cb-43db-aed4-52d966df5339-config-out\") pod \"alertmanager-main-0\" (UID: \"3c5d1674-62cb-43db-aed4-52d966df5339\") " pod="openshift-monitoring/alertmanager-main-0" Dec 09 15:04:23 crc kubenswrapper[4555]: I1209 15:04:23.066956 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/3c5d1674-62cb-43db-aed4-52d966df5339-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"3c5d1674-62cb-43db-aed4-52d966df5339\") " pod="openshift-monitoring/alertmanager-main-0" Dec 09 15:04:23 crc kubenswrapper[4555]: I1209 15:04:23.066978 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3c5d1674-62cb-43db-aed4-52d966df5339-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"3c5d1674-62cb-43db-aed4-52d966df5339\") " pod="openshift-monitoring/alertmanager-main-0" Dec 09 15:04:23 crc kubenswrapper[4555]: I1209 15:04:23.067003 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3c5d1674-62cb-43db-aed4-52d966df5339-tls-assets\") pod \"alertmanager-main-0\" (UID: \"3c5d1674-62cb-43db-aed4-52d966df5339\") " 
pod="openshift-monitoring/alertmanager-main-0" Dec 09 15:04:23 crc kubenswrapper[4555]: I1209 15:04:23.067170 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/3c5d1674-62cb-43db-aed4-52d966df5339-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"3c5d1674-62cb-43db-aed4-52d966df5339\") " pod="openshift-monitoring/alertmanager-main-0" Dec 09 15:04:23 crc kubenswrapper[4555]: I1209 15:04:23.067211 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3c5d1674-62cb-43db-aed4-52d966df5339-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"3c5d1674-62cb-43db-aed4-52d966df5339\") " pod="openshift-monitoring/alertmanager-main-0" Dec 09 15:04:23 crc kubenswrapper[4555]: I1209 15:04:23.067259 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/3c5d1674-62cb-43db-aed4-52d966df5339-config-volume\") pod \"alertmanager-main-0\" (UID: \"3c5d1674-62cb-43db-aed4-52d966df5339\") " pod="openshift-monitoring/alertmanager-main-0" Dec 09 15:04:23 crc kubenswrapper[4555]: I1209 15:04:23.067285 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/3c5d1674-62cb-43db-aed4-52d966df5339-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"3c5d1674-62cb-43db-aed4-52d966df5339\") " pod="openshift-monitoring/alertmanager-main-0" Dec 09 15:04:23 crc kubenswrapper[4555]: I1209 15:04:23.067333 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3c5d1674-62cb-43db-aed4-52d966df5339-web-config\") pod \"alertmanager-main-0\" (UID: \"3c5d1674-62cb-43db-aed4-52d966df5339\") " pod="openshift-monitoring/alertmanager-main-0" Dec 09 15:04:23 crc kubenswrapper[4555]: I1209 15:04:23.067369 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3c5d1674-62cb-43db-aed4-52d966df5339-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"3c5d1674-62cb-43db-aed4-52d966df5339\") " pod="openshift-monitoring/alertmanager-main-0" Dec 09 15:04:23 crc kubenswrapper[4555]: I1209 15:04:23.067384 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3c5d1674-62cb-43db-aed4-52d966df5339-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"3c5d1674-62cb-43db-aed4-52d966df5339\") " pod="openshift-monitoring/alertmanager-main-0" Dec 09 15:04:23 crc kubenswrapper[4555]: I1209 15:04:23.067429 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cr2kf\" (UniqueName: \"kubernetes.io/projected/3c5d1674-62cb-43db-aed4-52d966df5339-kube-api-access-cr2kf\") pod \"alertmanager-main-0\" (UID: \"3c5d1674-62cb-43db-aed4-52d966df5339\") " pod="openshift-monitoring/alertmanager-main-0" Dec 09 15:04:23 crc kubenswrapper[4555]: E1209 15:04:23.158931 4555 secret.go:188] Couldn't get secret 
openshift-monitoring/node-exporter-kube-rbac-proxy-config: failed to sync secret cache: timed out waiting for the condition Dec 09 15:04:23 crc kubenswrapper[4555]: E1209 15:04:23.158978 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/95215982-c5fe-4f89-87f4-b0148f496f7c-node-exporter-kube-rbac-proxy-config podName:95215982-c5fe-4f89-87f4-b0148f496f7c nodeName:}" failed. No retries permitted until 2025-12-09 15:04:23.65896514 +0000 UTC m=+373.795677858 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-kube-rbac-proxy-config" (UniqueName: "kubernetes.io/secret/95215982-c5fe-4f89-87f4-b0148f496f7c-node-exporter-kube-rbac-proxy-config") pod "node-exporter-v6jwh" (UID: "95215982-c5fe-4f89-87f4-b0148f496f7c") : failed to sync secret cache: timed out waiting for the condition Dec 09 15:04:23 crc kubenswrapper[4555]: I1209 15:04:23.168711 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/3c5d1674-62cb-43db-aed4-52d966df5339-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"3c5d1674-62cb-43db-aed4-52d966df5339\") " pod="openshift-monitoring/alertmanager-main-0" Dec 09 15:04:23 crc kubenswrapper[4555]: I1209 15:04:23.168746 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3c5d1674-62cb-43db-aed4-52d966df5339-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"3c5d1674-62cb-43db-aed4-52d966df5339\") " pod="openshift-monitoring/alertmanager-main-0" Dec 09 15:04:23 crc kubenswrapper[4555]: I1209 15:04:23.168778 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3c5d1674-62cb-43db-aed4-52d966df5339-tls-assets\") pod \"alertmanager-main-0\" (UID: \"3c5d1674-62cb-43db-aed4-52d966df5339\") " pod="openshift-monitoring/alertmanager-main-0" Dec 09 15:04:23 crc kubenswrapper[4555]: I1209 15:04:23.168821 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/3c5d1674-62cb-43db-aed4-52d966df5339-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"3c5d1674-62cb-43db-aed4-52d966df5339\") " pod="openshift-monitoring/alertmanager-main-0" Dec 09 15:04:23 crc kubenswrapper[4555]: I1209 15:04:23.168841 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3c5d1674-62cb-43db-aed4-52d966df5339-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"3c5d1674-62cb-43db-aed4-52d966df5339\") " pod="openshift-monitoring/alertmanager-main-0" Dec 09 15:04:23 crc kubenswrapper[4555]: I1209 15:04:23.168865 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/3c5d1674-62cb-43db-aed4-52d966df5339-config-volume\") pod \"alertmanager-main-0\" (UID: \"3c5d1674-62cb-43db-aed4-52d966df5339\") " pod="openshift-monitoring/alertmanager-main-0" Dec 09 15:04:23 crc kubenswrapper[4555]: I1209 15:04:23.168894 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: 
\"kubernetes.io/secret/3c5d1674-62cb-43db-aed4-52d966df5339-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"3c5d1674-62cb-43db-aed4-52d966df5339\") " pod="openshift-monitoring/alertmanager-main-0" Dec 09 15:04:23 crc kubenswrapper[4555]: I1209 15:04:23.168917 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3c5d1674-62cb-43db-aed4-52d966df5339-web-config\") pod \"alertmanager-main-0\" (UID: \"3c5d1674-62cb-43db-aed4-52d966df5339\") " pod="openshift-monitoring/alertmanager-main-0" Dec 09 15:04:23 crc kubenswrapper[4555]: I1209 15:04:23.168946 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3c5d1674-62cb-43db-aed4-52d966df5339-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"3c5d1674-62cb-43db-aed4-52d966df5339\") " pod="openshift-monitoring/alertmanager-main-0" Dec 09 15:04:23 crc kubenswrapper[4555]: I1209 15:04:23.168963 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3c5d1674-62cb-43db-aed4-52d966df5339-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"3c5d1674-62cb-43db-aed4-52d966df5339\") " pod="openshift-monitoring/alertmanager-main-0" Dec 09 15:04:23 crc kubenswrapper[4555]: I1209 15:04:23.168988 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cr2kf\" (UniqueName: \"kubernetes.io/projected/3c5d1674-62cb-43db-aed4-52d966df5339-kube-api-access-cr2kf\") pod \"alertmanager-main-0\" (UID: \"3c5d1674-62cb-43db-aed4-52d966df5339\") " pod="openshift-monitoring/alertmanager-main-0" Dec 09 15:04:23 crc kubenswrapper[4555]: I1209 15:04:23.169008 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3c5d1674-62cb-43db-aed4-52d966df5339-config-out\") pod \"alertmanager-main-0\" (UID: \"3c5d1674-62cb-43db-aed4-52d966df5339\") " pod="openshift-monitoring/alertmanager-main-0" Dec 09 15:04:23 crc kubenswrapper[4555]: I1209 15:04:23.169646 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/3c5d1674-62cb-43db-aed4-52d966df5339-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"3c5d1674-62cb-43db-aed4-52d966df5339\") " pod="openshift-monitoring/alertmanager-main-0" Dec 09 15:04:23 crc kubenswrapper[4555]: I1209 15:04:23.169962 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3c5d1674-62cb-43db-aed4-52d966df5339-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"3c5d1674-62cb-43db-aed4-52d966df5339\") " pod="openshift-monitoring/alertmanager-main-0" Dec 09 15:04:23 crc kubenswrapper[4555]: I1209 15:04:23.170034 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3c5d1674-62cb-43db-aed4-52d966df5339-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"3c5d1674-62cb-43db-aed4-52d966df5339\") " pod="openshift-monitoring/alertmanager-main-0" Dec 09 15:04:23 crc kubenswrapper[4555]: I1209 15:04:23.171702 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: 
\"kubernetes.io/secret/3c5d1674-62cb-43db-aed4-52d966df5339-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"3c5d1674-62cb-43db-aed4-52d966df5339\") " pod="openshift-monitoring/alertmanager-main-0" Dec 09 15:04:23 crc kubenswrapper[4555]: I1209 15:04:23.171783 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3c5d1674-62cb-43db-aed4-52d966df5339-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"3c5d1674-62cb-43db-aed4-52d966df5339\") " pod="openshift-monitoring/alertmanager-main-0" Dec 09 15:04:23 crc kubenswrapper[4555]: I1209 15:04:23.175624 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/3c5d1674-62cb-43db-aed4-52d966df5339-config-volume\") pod \"alertmanager-main-0\" (UID: \"3c5d1674-62cb-43db-aed4-52d966df5339\") " pod="openshift-monitoring/alertmanager-main-0" Dec 09 15:04:23 crc kubenswrapper[4555]: I1209 15:04:23.175779 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3c5d1674-62cb-43db-aed4-52d966df5339-tls-assets\") pod \"alertmanager-main-0\" (UID: \"3c5d1674-62cb-43db-aed4-52d966df5339\") " pod="openshift-monitoring/alertmanager-main-0" Dec 09 15:04:23 crc kubenswrapper[4555]: I1209 15:04:23.175819 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3c5d1674-62cb-43db-aed4-52d966df5339-web-config\") pod \"alertmanager-main-0\" (UID: \"3c5d1674-62cb-43db-aed4-52d966df5339\") " pod="openshift-monitoring/alertmanager-main-0" Dec 09 15:04:23 crc kubenswrapper[4555]: I1209 15:04:23.175841 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3c5d1674-62cb-43db-aed4-52d966df5339-config-out\") pod \"alertmanager-main-0\" (UID: \"3c5d1674-62cb-43db-aed4-52d966df5339\") " pod="openshift-monitoring/alertmanager-main-0" Dec 09 15:04:23 crc kubenswrapper[4555]: I1209 15:04:23.176235 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3c5d1674-62cb-43db-aed4-52d966df5339-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"3c5d1674-62cb-43db-aed4-52d966df5339\") " pod="openshift-monitoring/alertmanager-main-0" Dec 09 15:04:23 crc kubenswrapper[4555]: I1209 15:04:23.176827 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/3c5d1674-62cb-43db-aed4-52d966df5339-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"3c5d1674-62cb-43db-aed4-52d966df5339\") " pod="openshift-monitoring/alertmanager-main-0" Dec 09 15:04:23 crc kubenswrapper[4555]: I1209 15:04:23.186216 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cr2kf\" (UniqueName: \"kubernetes.io/projected/3c5d1674-62cb-43db-aed4-52d966df5339-kube-api-access-cr2kf\") pod \"alertmanager-main-0\" (UID: \"3c5d1674-62cb-43db-aed4-52d966df5339\") " pod="openshift-monitoring/alertmanager-main-0" Dec 09 15:04:23 crc kubenswrapper[4555]: I1209 15:04:23.212119 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config" Dec 09 15:04:23 crc kubenswrapper[4555]: I1209 
15:04:23.326579 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Dec 09 15:04:23 crc kubenswrapper[4555]: I1209 15:04:23.528498 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-dockercfg-txn95" Dec 09 15:04:23 crc kubenswrapper[4555]: I1209 15:04:23.674966 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/95215982-c5fe-4f89-87f4-b0148f496f7c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-v6jwh\" (UID: \"95215982-c5fe-4f89-87f4-b0148f496f7c\") " pod="openshift-monitoring/node-exporter-v6jwh" Dec 09 15:04:23 crc kubenswrapper[4555]: I1209 15:04:23.679979 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/95215982-c5fe-4f89-87f4-b0148f496f7c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-v6jwh\" (UID: \"95215982-c5fe-4f89-87f4-b0148f496f7c\") " pod="openshift-monitoring/node-exporter-v6jwh" Dec 09 15:04:23 crc kubenswrapper[4555]: I1209 15:04:23.699206 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Dec 09 15:04:23 crc kubenswrapper[4555]: I1209 15:04:23.795584 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-v6jwh" Dec 09 15:04:23 crc kubenswrapper[4555]: I1209 15:04:23.869693 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3c5d1674-62cb-43db-aed4-52d966df5339","Type":"ContainerStarted","Data":"37b7cce3435eccbfb2ffff28640e9d655185e7c612507060c36a2e5cf064cdfd"} Dec 09 15:04:23 crc kubenswrapper[4555]: I1209 15:04:23.922929 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-868cc685f5-brvwb"] Dec 09 15:04:23 crc kubenswrapper[4555]: I1209 15:04:23.925422 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-868cc685f5-brvwb" Dec 09 15:04:23 crc kubenswrapper[4555]: I1209 15:04:23.929859 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-dockercfg-4rngv" Dec 09 15:04:23 crc kubenswrapper[4555]: I1209 15:04:23.930036 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy" Dec 09 15:04:23 crc kubenswrapper[4555]: I1209 15:04:23.930153 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-rules" Dec 09 15:04:23 crc kubenswrapper[4555]: I1209 15:04:23.930474 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-metrics" Dec 09 15:04:23 crc kubenswrapper[4555]: I1209 15:04:23.930594 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-tls" Dec 09 15:04:23 crc kubenswrapper[4555]: I1209 15:04:23.930697 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-grpc-tls-1s4leutg4hber" Dec 09 15:04:23 crc kubenswrapper[4555]: I1209 15:04:23.930821 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-web" Dec 09 15:04:23 crc kubenswrapper[4555]: I1209 15:04:23.931888 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-868cc685f5-brvwb"] Dec 09 15:04:23 crc kubenswrapper[4555]: I1209 15:04:23.978747 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlg6d\" (UniqueName: \"kubernetes.io/projected/a95a9cbd-dcef-4b15-96a0-2efd23287563-kube-api-access-qlg6d\") pod \"thanos-querier-868cc685f5-brvwb\" (UID: \"a95a9cbd-dcef-4b15-96a0-2efd23287563\") " pod="openshift-monitoring/thanos-querier-868cc685f5-brvwb" Dec 09 15:04:23 crc kubenswrapper[4555]: I1209 15:04:23.979036 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a95a9cbd-dcef-4b15-96a0-2efd23287563-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-868cc685f5-brvwb\" (UID: \"a95a9cbd-dcef-4b15-96a0-2efd23287563\") " pod="openshift-monitoring/thanos-querier-868cc685f5-brvwb" Dec 09 15:04:23 crc kubenswrapper[4555]: I1209 15:04:23.979151 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/a95a9cbd-dcef-4b15-96a0-2efd23287563-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-868cc685f5-brvwb\" (UID: \"a95a9cbd-dcef-4b15-96a0-2efd23287563\") " pod="openshift-monitoring/thanos-querier-868cc685f5-brvwb" Dec 09 15:04:23 crc kubenswrapper[4555]: I1209 15:04:23.979202 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a95a9cbd-dcef-4b15-96a0-2efd23287563-metrics-client-ca\") pod \"thanos-querier-868cc685f5-brvwb\" (UID: \"a95a9cbd-dcef-4b15-96a0-2efd23287563\") " pod="openshift-monitoring/thanos-querier-868cc685f5-brvwb" Dec 09 15:04:23 crc kubenswrapper[4555]: I1209 15:04:23.979231 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/a95a9cbd-dcef-4b15-96a0-2efd23287563-secret-thanos-querier-tls\") pod \"thanos-querier-868cc685f5-brvwb\" (UID: \"a95a9cbd-dcef-4b15-96a0-2efd23287563\") " pod="openshift-monitoring/thanos-querier-868cc685f5-brvwb" Dec 09 15:04:23 crc kubenswrapper[4555]: I1209 15:04:23.979285 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/a95a9cbd-dcef-4b15-96a0-2efd23287563-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-868cc685f5-brvwb\" (UID: \"a95a9cbd-dcef-4b15-96a0-2efd23287563\") " pod="openshift-monitoring/thanos-querier-868cc685f5-brvwb" Dec 09 15:04:23 crc kubenswrapper[4555]: I1209 15:04:23.979342 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a95a9cbd-dcef-4b15-96a0-2efd23287563-secret-grpc-tls\") pod \"thanos-querier-868cc685f5-brvwb\" (UID: \"a95a9cbd-dcef-4b15-96a0-2efd23287563\") " pod="openshift-monitoring/thanos-querier-868cc685f5-brvwb" Dec 09 15:04:23 crc kubenswrapper[4555]: I1209 15:04:23.979369 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a95a9cbd-dcef-4b15-96a0-2efd23287563-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-868cc685f5-brvwb\" (UID: \"a95a9cbd-dcef-4b15-96a0-2efd23287563\") " pod="openshift-monitoring/thanos-querier-868cc685f5-brvwb" Dec 09 15:04:24 crc kubenswrapper[4555]: I1209 15:04:24.080431 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/a95a9cbd-dcef-4b15-96a0-2efd23287563-secret-thanos-querier-tls\") pod \"thanos-querier-868cc685f5-brvwb\" (UID: \"a95a9cbd-dcef-4b15-96a0-2efd23287563\") " pod="openshift-monitoring/thanos-querier-868cc685f5-brvwb" Dec 09 15:04:24 crc kubenswrapper[4555]: I1209 15:04:24.080564 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/a95a9cbd-dcef-4b15-96a0-2efd23287563-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-868cc685f5-brvwb\" (UID: \"a95a9cbd-dcef-4b15-96a0-2efd23287563\") " pod="openshift-monitoring/thanos-querier-868cc685f5-brvwb" Dec 09 15:04:24 crc kubenswrapper[4555]: I1209 15:04:24.080605 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a95a9cbd-dcef-4b15-96a0-2efd23287563-secret-grpc-tls\") pod \"thanos-querier-868cc685f5-brvwb\" (UID: \"a95a9cbd-dcef-4b15-96a0-2efd23287563\") " pod="openshift-monitoring/thanos-querier-868cc685f5-brvwb" Dec 09 15:04:24 crc kubenswrapper[4555]: I1209 15:04:24.080633 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a95a9cbd-dcef-4b15-96a0-2efd23287563-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-868cc685f5-brvwb\" (UID: \"a95a9cbd-dcef-4b15-96a0-2efd23287563\") " pod="openshift-monitoring/thanos-querier-868cc685f5-brvwb" Dec 09 15:04:24 crc kubenswrapper[4555]: I1209 15:04:24.080687 4555 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-qlg6d\" (UniqueName: \"kubernetes.io/projected/a95a9cbd-dcef-4b15-96a0-2efd23287563-kube-api-access-qlg6d\") pod \"thanos-querier-868cc685f5-brvwb\" (UID: \"a95a9cbd-dcef-4b15-96a0-2efd23287563\") " pod="openshift-monitoring/thanos-querier-868cc685f5-brvwb" Dec 09 15:04:24 crc kubenswrapper[4555]: I1209 15:04:24.080722 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a95a9cbd-dcef-4b15-96a0-2efd23287563-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-868cc685f5-brvwb\" (UID: \"a95a9cbd-dcef-4b15-96a0-2efd23287563\") " pod="openshift-monitoring/thanos-querier-868cc685f5-brvwb" Dec 09 15:04:24 crc kubenswrapper[4555]: I1209 15:04:24.080739 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/a95a9cbd-dcef-4b15-96a0-2efd23287563-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-868cc685f5-brvwb\" (UID: \"a95a9cbd-dcef-4b15-96a0-2efd23287563\") " pod="openshift-monitoring/thanos-querier-868cc685f5-brvwb" Dec 09 15:04:24 crc kubenswrapper[4555]: I1209 15:04:24.080782 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a95a9cbd-dcef-4b15-96a0-2efd23287563-metrics-client-ca\") pod \"thanos-querier-868cc685f5-brvwb\" (UID: \"a95a9cbd-dcef-4b15-96a0-2efd23287563\") " pod="openshift-monitoring/thanos-querier-868cc685f5-brvwb" Dec 09 15:04:24 crc kubenswrapper[4555]: I1209 15:04:24.081941 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a95a9cbd-dcef-4b15-96a0-2efd23287563-metrics-client-ca\") pod \"thanos-querier-868cc685f5-brvwb\" (UID: \"a95a9cbd-dcef-4b15-96a0-2efd23287563\") " pod="openshift-monitoring/thanos-querier-868cc685f5-brvwb" Dec 09 15:04:24 crc kubenswrapper[4555]: I1209 15:04:24.083535 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/a95a9cbd-dcef-4b15-96a0-2efd23287563-secret-thanos-querier-tls\") pod \"thanos-querier-868cc685f5-brvwb\" (UID: \"a95a9cbd-dcef-4b15-96a0-2efd23287563\") " pod="openshift-monitoring/thanos-querier-868cc685f5-brvwb" Dec 09 15:04:24 crc kubenswrapper[4555]: I1209 15:04:24.084020 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a95a9cbd-dcef-4b15-96a0-2efd23287563-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-868cc685f5-brvwb\" (UID: \"a95a9cbd-dcef-4b15-96a0-2efd23287563\") " pod="openshift-monitoring/thanos-querier-868cc685f5-brvwb" Dec 09 15:04:24 crc kubenswrapper[4555]: I1209 15:04:24.084507 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/a95a9cbd-dcef-4b15-96a0-2efd23287563-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-868cc685f5-brvwb\" (UID: \"a95a9cbd-dcef-4b15-96a0-2efd23287563\") " pod="openshift-monitoring/thanos-querier-868cc685f5-brvwb" Dec 09 15:04:24 crc kubenswrapper[4555]: I1209 15:04:24.084927 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: 
\"kubernetes.io/secret/a95a9cbd-dcef-4b15-96a0-2efd23287563-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-868cc685f5-brvwb\" (UID: \"a95a9cbd-dcef-4b15-96a0-2efd23287563\") " pod="openshift-monitoring/thanos-querier-868cc685f5-brvwb" Dec 09 15:04:24 crc kubenswrapper[4555]: I1209 15:04:24.085006 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a95a9cbd-dcef-4b15-96a0-2efd23287563-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-868cc685f5-brvwb\" (UID: \"a95a9cbd-dcef-4b15-96a0-2efd23287563\") " pod="openshift-monitoring/thanos-querier-868cc685f5-brvwb" Dec 09 15:04:24 crc kubenswrapper[4555]: I1209 15:04:24.085563 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a95a9cbd-dcef-4b15-96a0-2efd23287563-secret-grpc-tls\") pod \"thanos-querier-868cc685f5-brvwb\" (UID: \"a95a9cbd-dcef-4b15-96a0-2efd23287563\") " pod="openshift-monitoring/thanos-querier-868cc685f5-brvwb" Dec 09 15:04:24 crc kubenswrapper[4555]: I1209 15:04:24.094014 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlg6d\" (UniqueName: \"kubernetes.io/projected/a95a9cbd-dcef-4b15-96a0-2efd23287563-kube-api-access-qlg6d\") pod \"thanos-querier-868cc685f5-brvwb\" (UID: \"a95a9cbd-dcef-4b15-96a0-2efd23287563\") " pod="openshift-monitoring/thanos-querier-868cc685f5-brvwb" Dec 09 15:04:24 crc kubenswrapper[4555]: I1209 15:04:24.238746 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-868cc685f5-brvwb" Dec 09 15:04:24 crc kubenswrapper[4555]: W1209 15:04:24.313498 4555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95215982_c5fe_4f89_87f4_b0148f496f7c.slice/crio-764527ca7977dc3e6007fc883449ccdeba5c0084797131622d9ca9aa931f5a59 WatchSource:0}: Error finding container 764527ca7977dc3e6007fc883449ccdeba5c0084797131622d9ca9aa931f5a59: Status 404 returned error can't find the container with id 764527ca7977dc3e6007fc883449ccdeba5c0084797131622d9ca9aa931f5a59 Dec 09 15:04:24 crc kubenswrapper[4555]: I1209 15:04:24.649031 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-868cc685f5-brvwb"] Dec 09 15:04:24 crc kubenswrapper[4555]: W1209 15:04:24.652944 4555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda95a9cbd_dcef_4b15_96a0_2efd23287563.slice/crio-1a1a34c848006fe84758f9eb998f10add26ea72cf0a304632d59ce44a4146f48 WatchSource:0}: Error finding container 1a1a34c848006fe84758f9eb998f10add26ea72cf0a304632d59ce44a4146f48: Status 404 returned error can't find the container with id 1a1a34c848006fe84758f9eb998f10add26ea72cf0a304632d59ce44a4146f48 Dec 09 15:04:24 crc kubenswrapper[4555]: I1209 15:04:24.875674 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-hw7z2" event={"ID":"7572ab14-4b5b-4f9d-91df-9e2efa7cc560","Type":"ContainerStarted","Data":"d6f1bc612d53735900bc2564aa74d0f505518360a760ccd57cc73b0325be506e"} Dec 09 15:04:24 crc kubenswrapper[4555]: I1209 15:04:24.875740 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-hw7z2" 
event={"ID":"7572ab14-4b5b-4f9d-91df-9e2efa7cc560","Type":"ContainerStarted","Data":"9581a50fabb6d9efab79edb24d3f23bd471fb5e8745eec0a726c1c4fe656f6d2"} Dec 09 15:04:24 crc kubenswrapper[4555]: I1209 15:04:24.875752 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-hw7z2" event={"ID":"7572ab14-4b5b-4f9d-91df-9e2efa7cc560","Type":"ContainerStarted","Data":"067aa236003b3a24036e12635e275c6f734d5a820a8ae575fc09f80643341615"} Dec 09 15:04:24 crc kubenswrapper[4555]: I1209 15:04:24.878917 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-868cc685f5-brvwb" event={"ID":"a95a9cbd-dcef-4b15-96a0-2efd23287563","Type":"ContainerStarted","Data":"1a1a34c848006fe84758f9eb998f10add26ea72cf0a304632d59ce44a4146f48"} Dec 09 15:04:24 crc kubenswrapper[4555]: I1209 15:04:24.886704 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-f6g6m" event={"ID":"aa2c82dc-2f5f-4749-a8b0-56f2fcc8f964","Type":"ContainerStarted","Data":"c0be640204b3abf75c379929c4f98e07076d5dbe0d85b04051343fac1520c953"} Dec 09 15:04:24 crc kubenswrapper[4555]: I1209 15:04:24.891248 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-hw7z2" podStartSLOduration=2.295992373 podStartE2EDuration="3.891236309s" podCreationTimestamp="2025-12-09 15:04:21 +0000 UTC" firstStartedPulling="2025-12-09 15:04:22.712903506 +0000 UTC m=+372.849616223" lastFinishedPulling="2025-12-09 15:04:24.308147441 +0000 UTC m=+374.444860159" observedRunningTime="2025-12-09 15:04:24.889000088 +0000 UTC m=+375.025712806" watchObservedRunningTime="2025-12-09 15:04:24.891236309 +0000 UTC m=+375.027949016" Dec 09 15:04:24 crc kubenswrapper[4555]: I1209 15:04:24.893319 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-v6jwh" event={"ID":"95215982-c5fe-4f89-87f4-b0148f496f7c","Type":"ContainerStarted","Data":"764527ca7977dc3e6007fc883449ccdeba5c0084797131622d9ca9aa931f5a59"} Dec 09 15:04:24 crc kubenswrapper[4555]: I1209 15:04:24.904068 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-566fddb674-f6g6m" podStartSLOduration=2.411219413 podStartE2EDuration="3.904059246s" podCreationTimestamp="2025-12-09 15:04:21 +0000 UTC" firstStartedPulling="2025-12-09 15:04:22.819246986 +0000 UTC m=+372.955959704" lastFinishedPulling="2025-12-09 15:04:24.312086818 +0000 UTC m=+374.448799537" observedRunningTime="2025-12-09 15:04:24.900517019 +0000 UTC m=+375.037229736" watchObservedRunningTime="2025-12-09 15:04:24.904059246 +0000 UTC m=+375.040771964" Dec 09 15:04:25 crc kubenswrapper[4555]: I1209 15:04:25.531428 4555 patch_prober.go:28] interesting pod/machine-config-daemon-x6p6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 15:04:25 crc kubenswrapper[4555]: I1209 15:04:25.531712 4555 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 15:04:25 crc kubenswrapper[4555]: I1209 
15:04:25.899058 4555 generic.go:334] "Generic (PLEG): container finished" podID="3c5d1674-62cb-43db-aed4-52d966df5339" containerID="780f3d6000f7f07325b886309f1843cd7fd9caed6290a8e67e21534a2d6aacc1" exitCode=0 Dec 09 15:04:25 crc kubenswrapper[4555]: I1209 15:04:25.899181 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3c5d1674-62cb-43db-aed4-52d966df5339","Type":"ContainerDied","Data":"780f3d6000f7f07325b886309f1843cd7fd9caed6290a8e67e21534a2d6aacc1"} Dec 09 15:04:25 crc kubenswrapper[4555]: I1209 15:04:25.901863 4555 generic.go:334] "Generic (PLEG): container finished" podID="95215982-c5fe-4f89-87f4-b0148f496f7c" containerID="ffd76138b1564a69bec86564029808ecabedda65ee2c414da1c0b623606a5fef" exitCode=0 Dec 09 15:04:25 crc kubenswrapper[4555]: I1209 15:04:25.901916 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-v6jwh" event={"ID":"95215982-c5fe-4f89-87f4-b0148f496f7c","Type":"ContainerDied","Data":"ffd76138b1564a69bec86564029808ecabedda65ee2c414da1c0b623606a5fef"} Dec 09 15:04:26 crc kubenswrapper[4555]: I1209 15:04:26.809860 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-c7f86c659-l8nlg"] Dec 09 15:04:26 crc kubenswrapper[4555]: I1209 15:04:26.811661 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-c7f86c659-l8nlg" Dec 09 15:04:26 crc kubenswrapper[4555]: I1209 15:04:26.824087 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-c7f86c659-l8nlg"] Dec 09 15:04:26 crc kubenswrapper[4555]: I1209 15:04:26.913690 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-868cc685f5-brvwb" event={"ID":"a95a9cbd-dcef-4b15-96a0-2efd23287563","Type":"ContainerStarted","Data":"a6cd340fa883eecffa1699713a084d47751f2161571d9c7846750c19985230cc"} Dec 09 15:04:26 crc kubenswrapper[4555]: I1209 15:04:26.913775 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-868cc685f5-brvwb" event={"ID":"a95a9cbd-dcef-4b15-96a0-2efd23287563","Type":"ContainerStarted","Data":"bcc7d6b8fb645bb83fb6fbb45723151db0f742c91c51636c8a33e685fa2d3746"} Dec 09 15:04:26 crc kubenswrapper[4555]: I1209 15:04:26.916816 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-v6jwh" event={"ID":"95215982-c5fe-4f89-87f4-b0148f496f7c","Type":"ContainerStarted","Data":"888606f1af57b993af3a0498d3a8b5ac8e2f9a92cb85f5af63a101255ae73530"} Dec 09 15:04:26 crc kubenswrapper[4555]: I1209 15:04:26.916868 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-v6jwh" event={"ID":"95215982-c5fe-4f89-87f4-b0148f496f7c","Type":"ContainerStarted","Data":"047a92e1b20e233148b5d50c9dfb689da2fd6a6017c759d6e0c2ed3674d093ac"} Dec 09 15:04:26 crc kubenswrapper[4555]: I1209 15:04:26.923093 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7b7d7\" (UniqueName: \"kubernetes.io/projected/eef1d965-488b-465b-bc5b-ce417339682b-kube-api-access-7b7d7\") pod \"console-c7f86c659-l8nlg\" (UID: \"eef1d965-488b-465b-bc5b-ce417339682b\") " pod="openshift-console/console-c7f86c659-l8nlg" Dec 09 15:04:26 crc kubenswrapper[4555]: I1209 15:04:26.923141 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/eef1d965-488b-465b-bc5b-ce417339682b-console-oauth-config\") pod \"console-c7f86c659-l8nlg\" (UID: \"eef1d965-488b-465b-bc5b-ce417339682b\") " pod="openshift-console/console-c7f86c659-l8nlg" Dec 09 15:04:26 crc kubenswrapper[4555]: I1209 15:04:26.923232 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eef1d965-488b-465b-bc5b-ce417339682b-console-config\") pod \"console-c7f86c659-l8nlg\" (UID: \"eef1d965-488b-465b-bc5b-ce417339682b\") " pod="openshift-console/console-c7f86c659-l8nlg" Dec 09 15:04:26 crc kubenswrapper[4555]: I1209 15:04:26.923277 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eef1d965-488b-465b-bc5b-ce417339682b-trusted-ca-bundle\") pod \"console-c7f86c659-l8nlg\" (UID: \"eef1d965-488b-465b-bc5b-ce417339682b\") " pod="openshift-console/console-c7f86c659-l8nlg" Dec 09 15:04:26 crc kubenswrapper[4555]: I1209 15:04:26.923298 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eef1d965-488b-465b-bc5b-ce417339682b-oauth-serving-cert\") pod \"console-c7f86c659-l8nlg\" (UID: \"eef1d965-488b-465b-bc5b-ce417339682b\") " pod="openshift-console/console-c7f86c659-l8nlg" Dec 09 15:04:26 crc kubenswrapper[4555]: I1209 15:04:26.923333 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eef1d965-488b-465b-bc5b-ce417339682b-console-serving-cert\") pod \"console-c7f86c659-l8nlg\" (UID: \"eef1d965-488b-465b-bc5b-ce417339682b\") " pod="openshift-console/console-c7f86c659-l8nlg" Dec 09 15:04:26 crc kubenswrapper[4555]: I1209 15:04:26.923494 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eef1d965-488b-465b-bc5b-ce417339682b-service-ca\") pod \"console-c7f86c659-l8nlg\" (UID: \"eef1d965-488b-465b-bc5b-ce417339682b\") " pod="openshift-console/console-c7f86c659-l8nlg" Dec 09 15:04:26 crc kubenswrapper[4555]: I1209 15:04:26.939890 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-v6jwh" podStartSLOduration=4.782103065 podStartE2EDuration="5.939862999s" podCreationTimestamp="2025-12-09 15:04:21 +0000 UTC" firstStartedPulling="2025-12-09 15:04:24.316073163 +0000 UTC m=+374.452785881" lastFinishedPulling="2025-12-09 15:04:25.473833097 +0000 UTC m=+375.610545815" observedRunningTime="2025-12-09 15:04:26.934554578 +0000 UTC m=+377.071267296" watchObservedRunningTime="2025-12-09 15:04:26.939862999 +0000 UTC m=+377.076575716" Dec 09 15:04:27 crc kubenswrapper[4555]: I1209 15:04:27.025494 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7b7d7\" (UniqueName: \"kubernetes.io/projected/eef1d965-488b-465b-bc5b-ce417339682b-kube-api-access-7b7d7\") pod \"console-c7f86c659-l8nlg\" (UID: \"eef1d965-488b-465b-bc5b-ce417339682b\") " pod="openshift-console/console-c7f86c659-l8nlg" Dec 09 15:04:27 crc kubenswrapper[4555]: I1209 15:04:27.025551 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eef1d965-488b-465b-bc5b-ce417339682b-console-oauth-config\") pod 
\"console-c7f86c659-l8nlg\" (UID: \"eef1d965-488b-465b-bc5b-ce417339682b\") " pod="openshift-console/console-c7f86c659-l8nlg" Dec 09 15:04:27 crc kubenswrapper[4555]: I1209 15:04:27.025727 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eef1d965-488b-465b-bc5b-ce417339682b-console-config\") pod \"console-c7f86c659-l8nlg\" (UID: \"eef1d965-488b-465b-bc5b-ce417339682b\") " pod="openshift-console/console-c7f86c659-l8nlg" Dec 09 15:04:27 crc kubenswrapper[4555]: I1209 15:04:27.025813 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eef1d965-488b-465b-bc5b-ce417339682b-trusted-ca-bundle\") pod \"console-c7f86c659-l8nlg\" (UID: \"eef1d965-488b-465b-bc5b-ce417339682b\") " pod="openshift-console/console-c7f86c659-l8nlg" Dec 09 15:04:27 crc kubenswrapper[4555]: I1209 15:04:27.025860 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eef1d965-488b-465b-bc5b-ce417339682b-oauth-serving-cert\") pod \"console-c7f86c659-l8nlg\" (UID: \"eef1d965-488b-465b-bc5b-ce417339682b\") " pod="openshift-console/console-c7f86c659-l8nlg" Dec 09 15:04:27 crc kubenswrapper[4555]: I1209 15:04:27.025884 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eef1d965-488b-465b-bc5b-ce417339682b-console-serving-cert\") pod \"console-c7f86c659-l8nlg\" (UID: \"eef1d965-488b-465b-bc5b-ce417339682b\") " pod="openshift-console/console-c7f86c659-l8nlg" Dec 09 15:04:27 crc kubenswrapper[4555]: I1209 15:04:27.025930 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eef1d965-488b-465b-bc5b-ce417339682b-service-ca\") pod \"console-c7f86c659-l8nlg\" (UID: \"eef1d965-488b-465b-bc5b-ce417339682b\") " pod="openshift-console/console-c7f86c659-l8nlg" Dec 09 15:04:27 crc kubenswrapper[4555]: I1209 15:04:27.026907 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eef1d965-488b-465b-bc5b-ce417339682b-service-ca\") pod \"console-c7f86c659-l8nlg\" (UID: \"eef1d965-488b-465b-bc5b-ce417339682b\") " pod="openshift-console/console-c7f86c659-l8nlg" Dec 09 15:04:27 crc kubenswrapper[4555]: I1209 15:04:27.027601 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eef1d965-488b-465b-bc5b-ce417339682b-console-config\") pod \"console-c7f86c659-l8nlg\" (UID: \"eef1d965-488b-465b-bc5b-ce417339682b\") " pod="openshift-console/console-c7f86c659-l8nlg" Dec 09 15:04:27 crc kubenswrapper[4555]: I1209 15:04:27.027609 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eef1d965-488b-465b-bc5b-ce417339682b-oauth-serving-cert\") pod \"console-c7f86c659-l8nlg\" (UID: \"eef1d965-488b-465b-bc5b-ce417339682b\") " pod="openshift-console/console-c7f86c659-l8nlg" Dec 09 15:04:27 crc kubenswrapper[4555]: I1209 15:04:27.028021 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eef1d965-488b-465b-bc5b-ce417339682b-trusted-ca-bundle\") pod \"console-c7f86c659-l8nlg\" (UID: \"eef1d965-488b-465b-bc5b-ce417339682b\") " 
pod="openshift-console/console-c7f86c659-l8nlg" Dec 09 15:04:27 crc kubenswrapper[4555]: I1209 15:04:27.031508 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eef1d965-488b-465b-bc5b-ce417339682b-console-serving-cert\") pod \"console-c7f86c659-l8nlg\" (UID: \"eef1d965-488b-465b-bc5b-ce417339682b\") " pod="openshift-console/console-c7f86c659-l8nlg" Dec 09 15:04:27 crc kubenswrapper[4555]: I1209 15:04:27.032276 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eef1d965-488b-465b-bc5b-ce417339682b-console-oauth-config\") pod \"console-c7f86c659-l8nlg\" (UID: \"eef1d965-488b-465b-bc5b-ce417339682b\") " pod="openshift-console/console-c7f86c659-l8nlg" Dec 09 15:04:27 crc kubenswrapper[4555]: I1209 15:04:27.044805 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7b7d7\" (UniqueName: \"kubernetes.io/projected/eef1d965-488b-465b-bc5b-ce417339682b-kube-api-access-7b7d7\") pod \"console-c7f86c659-l8nlg\" (UID: \"eef1d965-488b-465b-bc5b-ce417339682b\") " pod="openshift-console/console-c7f86c659-l8nlg" Dec 09 15:04:27 crc kubenswrapper[4555]: I1209 15:04:27.123793 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-c7f86c659-l8nlg" Dec 09 15:04:27 crc kubenswrapper[4555]: I1209 15:04:27.349818 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-7c88b9f8fd-2lsx5"] Dec 09 15:04:27 crc kubenswrapper[4555]: I1209 15:04:27.350700 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-7c88b9f8fd-2lsx5" Dec 09 15:04:27 crc kubenswrapper[4555]: I1209 15:04:27.352338 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs" Dec 09 15:04:27 crc kubenswrapper[4555]: I1209 15:04:27.354795 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles" Dec 09 15:04:27 crc kubenswrapper[4555]: I1209 15:04:27.354855 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle" Dec 09 15:04:27 crc kubenswrapper[4555]: I1209 15:04:27.355218 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-dockercfg-j4fzr" Dec 09 15:04:27 crc kubenswrapper[4555]: I1209 15:04:27.355352 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-e9lb5emsmp1j6" Dec 09 15:04:27 crc kubenswrapper[4555]: I1209 15:04:27.359193 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls" Dec 09 15:04:27 crc kubenswrapper[4555]: I1209 15:04:27.370026 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7c88b9f8fd-2lsx5"] Dec 09 15:04:27 crc kubenswrapper[4555]: I1209 15:04:27.431491 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/f2ca5e1b-022a-458a-ac63-7a66f3676329-metrics-server-audit-profiles\") pod \"metrics-server-7c88b9f8fd-2lsx5\" (UID: \"f2ca5e1b-022a-458a-ac63-7a66f3676329\") " pod="openshift-monitoring/metrics-server-7c88b9f8fd-2lsx5" Dec 09 15:04:27 crc kubenswrapper[4555]: I1209 
15:04:27.431552 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f2ca5e1b-022a-458a-ac63-7a66f3676329-secret-metrics-client-certs\") pod \"metrics-server-7c88b9f8fd-2lsx5\" (UID: \"f2ca5e1b-022a-458a-ac63-7a66f3676329\") " pod="openshift-monitoring/metrics-server-7c88b9f8fd-2lsx5" Dec 09 15:04:27 crc kubenswrapper[4555]: I1209 15:04:27.431584 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2ca5e1b-022a-458a-ac63-7a66f3676329-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7c88b9f8fd-2lsx5\" (UID: \"f2ca5e1b-022a-458a-ac63-7a66f3676329\") " pod="openshift-monitoring/metrics-server-7c88b9f8fd-2lsx5" Dec 09 15:04:27 crc kubenswrapper[4555]: I1209 15:04:27.431655 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2ca5e1b-022a-458a-ac63-7a66f3676329-client-ca-bundle\") pod \"metrics-server-7c88b9f8fd-2lsx5\" (UID: \"f2ca5e1b-022a-458a-ac63-7a66f3676329\") " pod="openshift-monitoring/metrics-server-7c88b9f8fd-2lsx5" Dec 09 15:04:27 crc kubenswrapper[4555]: I1209 15:04:27.431728 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wczqq\" (UniqueName: \"kubernetes.io/projected/f2ca5e1b-022a-458a-ac63-7a66f3676329-kube-api-access-wczqq\") pod \"metrics-server-7c88b9f8fd-2lsx5\" (UID: \"f2ca5e1b-022a-458a-ac63-7a66f3676329\") " pod="openshift-monitoring/metrics-server-7c88b9f8fd-2lsx5" Dec 09 15:04:27 crc kubenswrapper[4555]: I1209 15:04:27.431818 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/f2ca5e1b-022a-458a-ac63-7a66f3676329-audit-log\") pod \"metrics-server-7c88b9f8fd-2lsx5\" (UID: \"f2ca5e1b-022a-458a-ac63-7a66f3676329\") " pod="openshift-monitoring/metrics-server-7c88b9f8fd-2lsx5" Dec 09 15:04:27 crc kubenswrapper[4555]: I1209 15:04:27.431842 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/f2ca5e1b-022a-458a-ac63-7a66f3676329-secret-metrics-server-tls\") pod \"metrics-server-7c88b9f8fd-2lsx5\" (UID: \"f2ca5e1b-022a-458a-ac63-7a66f3676329\") " pod="openshift-monitoring/metrics-server-7c88b9f8fd-2lsx5" Dec 09 15:04:27 crc kubenswrapper[4555]: I1209 15:04:27.457508 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-c7f86c659-l8nlg"] Dec 09 15:04:27 crc kubenswrapper[4555]: I1209 15:04:27.533450 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2ca5e1b-022a-458a-ac63-7a66f3676329-client-ca-bundle\") pod \"metrics-server-7c88b9f8fd-2lsx5\" (UID: \"f2ca5e1b-022a-458a-ac63-7a66f3676329\") " pod="openshift-monitoring/metrics-server-7c88b9f8fd-2lsx5" Dec 09 15:04:27 crc kubenswrapper[4555]: I1209 15:04:27.533503 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wczqq\" (UniqueName: \"kubernetes.io/projected/f2ca5e1b-022a-458a-ac63-7a66f3676329-kube-api-access-wczqq\") pod \"metrics-server-7c88b9f8fd-2lsx5\" (UID: \"f2ca5e1b-022a-458a-ac63-7a66f3676329\") " 
pod="openshift-monitoring/metrics-server-7c88b9f8fd-2lsx5" Dec 09 15:04:27 crc kubenswrapper[4555]: I1209 15:04:27.533540 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/f2ca5e1b-022a-458a-ac63-7a66f3676329-audit-log\") pod \"metrics-server-7c88b9f8fd-2lsx5\" (UID: \"f2ca5e1b-022a-458a-ac63-7a66f3676329\") " pod="openshift-monitoring/metrics-server-7c88b9f8fd-2lsx5" Dec 09 15:04:27 crc kubenswrapper[4555]: I1209 15:04:27.533560 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/f2ca5e1b-022a-458a-ac63-7a66f3676329-secret-metrics-server-tls\") pod \"metrics-server-7c88b9f8fd-2lsx5\" (UID: \"f2ca5e1b-022a-458a-ac63-7a66f3676329\") " pod="openshift-monitoring/metrics-server-7c88b9f8fd-2lsx5" Dec 09 15:04:27 crc kubenswrapper[4555]: I1209 15:04:27.533608 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/f2ca5e1b-022a-458a-ac63-7a66f3676329-metrics-server-audit-profiles\") pod \"metrics-server-7c88b9f8fd-2lsx5\" (UID: \"f2ca5e1b-022a-458a-ac63-7a66f3676329\") " pod="openshift-monitoring/metrics-server-7c88b9f8fd-2lsx5" Dec 09 15:04:27 crc kubenswrapper[4555]: I1209 15:04:27.533657 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f2ca5e1b-022a-458a-ac63-7a66f3676329-secret-metrics-client-certs\") pod \"metrics-server-7c88b9f8fd-2lsx5\" (UID: \"f2ca5e1b-022a-458a-ac63-7a66f3676329\") " pod="openshift-monitoring/metrics-server-7c88b9f8fd-2lsx5" Dec 09 15:04:27 crc kubenswrapper[4555]: I1209 15:04:27.533683 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2ca5e1b-022a-458a-ac63-7a66f3676329-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7c88b9f8fd-2lsx5\" (UID: \"f2ca5e1b-022a-458a-ac63-7a66f3676329\") " pod="openshift-monitoring/metrics-server-7c88b9f8fd-2lsx5" Dec 09 15:04:27 crc kubenswrapper[4555]: I1209 15:04:27.534494 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/f2ca5e1b-022a-458a-ac63-7a66f3676329-audit-log\") pod \"metrics-server-7c88b9f8fd-2lsx5\" (UID: \"f2ca5e1b-022a-458a-ac63-7a66f3676329\") " pod="openshift-monitoring/metrics-server-7c88b9f8fd-2lsx5" Dec 09 15:04:27 crc kubenswrapper[4555]: I1209 15:04:27.534502 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2ca5e1b-022a-458a-ac63-7a66f3676329-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7c88b9f8fd-2lsx5\" (UID: \"f2ca5e1b-022a-458a-ac63-7a66f3676329\") " pod="openshift-monitoring/metrics-server-7c88b9f8fd-2lsx5" Dec 09 15:04:27 crc kubenswrapper[4555]: I1209 15:04:27.535447 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/f2ca5e1b-022a-458a-ac63-7a66f3676329-metrics-server-audit-profiles\") pod \"metrics-server-7c88b9f8fd-2lsx5\" (UID: \"f2ca5e1b-022a-458a-ac63-7a66f3676329\") " pod="openshift-monitoring/metrics-server-7c88b9f8fd-2lsx5" Dec 09 15:04:27 crc kubenswrapper[4555]: I1209 15:04:27.539908 4555 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/f2ca5e1b-022a-458a-ac63-7a66f3676329-secret-metrics-server-tls\") pod \"metrics-server-7c88b9f8fd-2lsx5\" (UID: \"f2ca5e1b-022a-458a-ac63-7a66f3676329\") " pod="openshift-monitoring/metrics-server-7c88b9f8fd-2lsx5" Dec 09 15:04:27 crc kubenswrapper[4555]: I1209 15:04:27.539946 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f2ca5e1b-022a-458a-ac63-7a66f3676329-secret-metrics-client-certs\") pod \"metrics-server-7c88b9f8fd-2lsx5\" (UID: \"f2ca5e1b-022a-458a-ac63-7a66f3676329\") " pod="openshift-monitoring/metrics-server-7c88b9f8fd-2lsx5" Dec 09 15:04:27 crc kubenswrapper[4555]: I1209 15:04:27.540005 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2ca5e1b-022a-458a-ac63-7a66f3676329-client-ca-bundle\") pod \"metrics-server-7c88b9f8fd-2lsx5\" (UID: \"f2ca5e1b-022a-458a-ac63-7a66f3676329\") " pod="openshift-monitoring/metrics-server-7c88b9f8fd-2lsx5" Dec 09 15:04:27 crc kubenswrapper[4555]: I1209 15:04:27.547509 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wczqq\" (UniqueName: \"kubernetes.io/projected/f2ca5e1b-022a-458a-ac63-7a66f3676329-kube-api-access-wczqq\") pod \"metrics-server-7c88b9f8fd-2lsx5\" (UID: \"f2ca5e1b-022a-458a-ac63-7a66f3676329\") " pod="openshift-monitoring/metrics-server-7c88b9f8fd-2lsx5" Dec 09 15:04:27 crc kubenswrapper[4555]: W1209 15:04:27.634084 4555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeef1d965_488b_465b_bc5b_ce417339682b.slice/crio-40e5a9ff883fe765e824450a97569f1e385407c6e7c314475c9958758dbb7a8f WatchSource:0}: Error finding container 40e5a9ff883fe765e824450a97569f1e385407c6e7c314475c9958758dbb7a8f: Status 404 returned error can't find the container with id 40e5a9ff883fe765e824450a97569f1e385407c6e7c314475c9958758dbb7a8f Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:27.667888 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-7c88b9f8fd-2lsx5" Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:27.738466 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-b4bd4568d-b9t26"] Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:27.740648 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-b4bd4568d-b9t26" Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:27.742364 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"default-dockercfg-6tstp" Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:27.745292 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-b4bd4568d-b9t26"] Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:27.747026 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"monitoring-plugin-cert" Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:27.842473 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/e2b2890b-92ed-42cc-b58b-8e04f86efd43-monitoring-plugin-cert\") pod \"monitoring-plugin-b4bd4568d-b9t26\" (UID: \"e2b2890b-92ed-42cc-b58b-8e04f86efd43\") " pod="openshift-monitoring/monitoring-plugin-b4bd4568d-b9t26" Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:27.928274 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3c5d1674-62cb-43db-aed4-52d966df5339","Type":"ContainerStarted","Data":"4710689a53ec3b4c8bff76e209477f33b185389aed1c05e1d0c27ed3a7686b9e"} Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:27.929373 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c7f86c659-l8nlg" event={"ID":"eef1d965-488b-465b-bc5b-ce417339682b","Type":"ContainerStarted","Data":"1865c132d3fb31fe70531d6c03bf6150c6ae1febaff26ecc903c75d3884e978f"} Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:27.929393 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c7f86c659-l8nlg" event={"ID":"eef1d965-488b-465b-bc5b-ce417339682b","Type":"ContainerStarted","Data":"40e5a9ff883fe765e824450a97569f1e385407c6e7c314475c9958758dbb7a8f"} Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:27.931197 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-868cc685f5-brvwb" event={"ID":"a95a9cbd-dcef-4b15-96a0-2efd23287563","Type":"ContainerStarted","Data":"71f5f0b0b233d0597823004d5bf4133a7d48106f34b2124577072cf2b8b20285"} Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:27.943748 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-c7f86c659-l8nlg" podStartSLOduration=1.9437328759999999 podStartE2EDuration="1.943732876s" podCreationTimestamp="2025-12-09 15:04:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:04:27.941889226 +0000 UTC m=+378.078601944" watchObservedRunningTime="2025-12-09 15:04:27.943732876 +0000 UTC m=+378.080445594" Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:27.944835 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/e2b2890b-92ed-42cc-b58b-8e04f86efd43-monitoring-plugin-cert\") pod \"monitoring-plugin-b4bd4568d-b9t26\" (UID: \"e2b2890b-92ed-42cc-b58b-8e04f86efd43\") " pod="openshift-monitoring/monitoring-plugin-b4bd4568d-b9t26" Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:27.951173 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/e2b2890b-92ed-42cc-b58b-8e04f86efd43-monitoring-plugin-cert\") pod \"monitoring-plugin-b4bd4568d-b9t26\" (UID: \"e2b2890b-92ed-42cc-b58b-8e04f86efd43\") " pod="openshift-monitoring/monitoring-plugin-b4bd4568d-b9t26" Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.070965 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-b4bd4568d-b9t26" Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.115802 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.119115 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.120827 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls" Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.121999 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s" Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.126655 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-kube-rbac-proxy-web" Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.127216 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-prometheus-http-client-file" Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.127389 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls-assets-0" Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.127485 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-web-config" Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.127658 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-rbac-proxy" Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.131373 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-trusted-ca-bundle" Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.131499 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-dockercfg-l5vcp" Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.131601 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-sidecar-tls" Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.131679 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-grpc-tls-1s228ud5qktql" Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.133151 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"serving-certs-ca-bundle" Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.134939 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-k8s-rulefiles-0" Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.139424 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.248357 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: 
\"kubernetes.io/secret/cb6681be-0413-4154-87d9-710a23a9c463-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"cb6681be-0413-4154-87d9-710a23a9c463\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.248391 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cb6681be-0413-4154-87d9-710a23a9c463-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cb6681be-0413-4154-87d9-710a23a9c463\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.248423 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cb6681be-0413-4154-87d9-710a23a9c463-config-out\") pod \"prometheus-k8s-0\" (UID: \"cb6681be-0413-4154-87d9-710a23a9c463\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.248450 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cb6681be-0413-4154-87d9-710a23a9c463-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cb6681be-0413-4154-87d9-710a23a9c463\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.248468 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/cb6681be-0413-4154-87d9-710a23a9c463-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"cb6681be-0413-4154-87d9-710a23a9c463\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.248484 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/cb6681be-0413-4154-87d9-710a23a9c463-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"cb6681be-0413-4154-87d9-710a23a9c463\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.248504 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cb6681be-0413-4154-87d9-710a23a9c463-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"cb6681be-0413-4154-87d9-710a23a9c463\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.248524 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cb6681be-0413-4154-87d9-710a23a9c463-config\") pod \"prometheus-k8s-0\" (UID: \"cb6681be-0413-4154-87d9-710a23a9c463\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.248539 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cb6681be-0413-4154-87d9-710a23a9c463-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"cb6681be-0413-4154-87d9-710a23a9c463\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.248580 4555 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cb6681be-0413-4154-87d9-710a23a9c463-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"cb6681be-0413-4154-87d9-710a23a9c463\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.248601 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vg8cs\" (UniqueName: \"kubernetes.io/projected/cb6681be-0413-4154-87d9-710a23a9c463-kube-api-access-vg8cs\") pod \"prometheus-k8s-0\" (UID: \"cb6681be-0413-4154-87d9-710a23a9c463\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.248678 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/cb6681be-0413-4154-87d9-710a23a9c463-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"cb6681be-0413-4154-87d9-710a23a9c463\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.248717 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/cb6681be-0413-4154-87d9-710a23a9c463-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"cb6681be-0413-4154-87d9-710a23a9c463\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.248754 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cb6681be-0413-4154-87d9-710a23a9c463-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"cb6681be-0413-4154-87d9-710a23a9c463\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.248881 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cb6681be-0413-4154-87d9-710a23a9c463-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"cb6681be-0413-4154-87d9-710a23a9c463\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.248949 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cb6681be-0413-4154-87d9-710a23a9c463-web-config\") pod \"prometheus-k8s-0\" (UID: \"cb6681be-0413-4154-87d9-710a23a9c463\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.248970 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/cb6681be-0413-4154-87d9-710a23a9c463-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"cb6681be-0413-4154-87d9-710a23a9c463\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.249042 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/cb6681be-0413-4154-87d9-710a23a9c463-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cb6681be-0413-4154-87d9-710a23a9c463\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.352063 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cb6681be-0413-4154-87d9-710a23a9c463-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"cb6681be-0413-4154-87d9-710a23a9c463\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.352252 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cb6681be-0413-4154-87d9-710a23a9c463-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"cb6681be-0413-4154-87d9-710a23a9c463\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.352295 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cb6681be-0413-4154-87d9-710a23a9c463-web-config\") pod \"prometheus-k8s-0\" (UID: \"cb6681be-0413-4154-87d9-710a23a9c463\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.352331 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/cb6681be-0413-4154-87d9-710a23a9c463-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"cb6681be-0413-4154-87d9-710a23a9c463\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.352375 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cb6681be-0413-4154-87d9-710a23a9c463-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cb6681be-0413-4154-87d9-710a23a9c463\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.352409 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/cb6681be-0413-4154-87d9-710a23a9c463-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"cb6681be-0413-4154-87d9-710a23a9c463\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.352426 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cb6681be-0413-4154-87d9-710a23a9c463-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cb6681be-0413-4154-87d9-710a23a9c463\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.352455 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cb6681be-0413-4154-87d9-710a23a9c463-config-out\") pod \"prometheus-k8s-0\" (UID: \"cb6681be-0413-4154-87d9-710a23a9c463\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.352473 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cb6681be-0413-4154-87d9-710a23a9c463-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cb6681be-0413-4154-87d9-710a23a9c463\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.352493 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/cb6681be-0413-4154-87d9-710a23a9c463-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"cb6681be-0413-4154-87d9-710a23a9c463\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.352507 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/cb6681be-0413-4154-87d9-710a23a9c463-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"cb6681be-0413-4154-87d9-710a23a9c463\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.352529 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cb6681be-0413-4154-87d9-710a23a9c463-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"cb6681be-0413-4154-87d9-710a23a9c463\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.352552 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cb6681be-0413-4154-87d9-710a23a9c463-config\") pod \"prometheus-k8s-0\" (UID: \"cb6681be-0413-4154-87d9-710a23a9c463\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.352568 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cb6681be-0413-4154-87d9-710a23a9c463-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"cb6681be-0413-4154-87d9-710a23a9c463\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.352614 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cb6681be-0413-4154-87d9-710a23a9c463-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"cb6681be-0413-4154-87d9-710a23a9c463\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.352637 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vg8cs\" (UniqueName: \"kubernetes.io/projected/cb6681be-0413-4154-87d9-710a23a9c463-kube-api-access-vg8cs\") pod \"prometheus-k8s-0\" (UID: \"cb6681be-0413-4154-87d9-710a23a9c463\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.352657 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/cb6681be-0413-4154-87d9-710a23a9c463-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"cb6681be-0413-4154-87d9-710a23a9c463\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.352671 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/cb6681be-0413-4154-87d9-710a23a9c463-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"cb6681be-0413-4154-87d9-710a23a9c463\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.353676 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cb6681be-0413-4154-87d9-710a23a9c463-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cb6681be-0413-4154-87d9-710a23a9c463\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.354189 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/cb6681be-0413-4154-87d9-710a23a9c463-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"cb6681be-0413-4154-87d9-710a23a9c463\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.356194 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cb6681be-0413-4154-87d9-710a23a9c463-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"cb6681be-0413-4154-87d9-710a23a9c463\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.356760 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cb6681be-0413-4154-87d9-710a23a9c463-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cb6681be-0413-4154-87d9-710a23a9c463\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.358517 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cb6681be-0413-4154-87d9-710a23a9c463-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"cb6681be-0413-4154-87d9-710a23a9c463\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.358894 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/cb6681be-0413-4154-87d9-710a23a9c463-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"cb6681be-0413-4154-87d9-710a23a9c463\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.359033 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/cb6681be-0413-4154-87d9-710a23a9c463-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"cb6681be-0413-4154-87d9-710a23a9c463\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.359124 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cb6681be-0413-4154-87d9-710a23a9c463-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cb6681be-0413-4154-87d9-710a23a9c463\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.360323 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/cb6681be-0413-4154-87d9-710a23a9c463-config\") pod \"prometheus-k8s-0\" (UID: \"cb6681be-0413-4154-87d9-710a23a9c463\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.362069 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cb6681be-0413-4154-87d9-710a23a9c463-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"cb6681be-0413-4154-87d9-710a23a9c463\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.362967 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/cb6681be-0413-4154-87d9-710a23a9c463-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"cb6681be-0413-4154-87d9-710a23a9c463\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.364124 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cb6681be-0413-4154-87d9-710a23a9c463-config-out\") pod \"prometheus-k8s-0\" (UID: \"cb6681be-0413-4154-87d9-710a23a9c463\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.364797 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/cb6681be-0413-4154-87d9-710a23a9c463-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"cb6681be-0413-4154-87d9-710a23a9c463\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.364817 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/cb6681be-0413-4154-87d9-710a23a9c463-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"cb6681be-0413-4154-87d9-710a23a9c463\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.364799 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cb6681be-0413-4154-87d9-710a23a9c463-web-config\") pod \"prometheus-k8s-0\" (UID: \"cb6681be-0413-4154-87d9-710a23a9c463\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.367246 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cb6681be-0413-4154-87d9-710a23a9c463-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"cb6681be-0413-4154-87d9-710a23a9c463\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.367765 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vg8cs\" (UniqueName: \"kubernetes.io/projected/cb6681be-0413-4154-87d9-710a23a9c463-kube-api-access-vg8cs\") pod \"prometheus-k8s-0\" (UID: \"cb6681be-0413-4154-87d9-710a23a9c463\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.376709 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cb6681be-0413-4154-87d9-710a23a9c463-prometheus-k8s-rulefiles-0\") pod 
\"prometheus-k8s-0\" (UID: \"cb6681be-0413-4154-87d9-710a23a9c463\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.443761 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.575336 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7c88b9f8fd-2lsx5"] Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.645078 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-b4bd4568d-b9t26"] Dec 09 15:04:28 crc kubenswrapper[4555]: W1209 15:04:28.649674 4555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2b2890b_92ed_42cc_b58b_8e04f86efd43.slice/crio-fb3f34ac6b3628e4ed3bc473849d4c8c5d5f3f1891f1ad5da5ebd438b3b93c75 WatchSource:0}: Error finding container fb3f34ac6b3628e4ed3bc473849d4c8c5d5f3f1891f1ad5da5ebd438b3b93c75: Status 404 returned error can't find the container with id fb3f34ac6b3628e4ed3bc473849d4c8c5d5f3f1891f1ad5da5ebd438b3b93c75 Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.817476 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Dec 09 15:04:28 crc kubenswrapper[4555]: W1209 15:04:28.822734 4555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb6681be_0413_4154_87d9_710a23a9c463.slice/crio-e46b756c1307bedc1d064100755dd2dca53e35a1e42af226d0a9337bb946eb1c WatchSource:0}: Error finding container e46b756c1307bedc1d064100755dd2dca53e35a1e42af226d0a9337bb946eb1c: Status 404 returned error can't find the container with id e46b756c1307bedc1d064100755dd2dca53e35a1e42af226d0a9337bb946eb1c Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.937922 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cb6681be-0413-4154-87d9-710a23a9c463","Type":"ContainerStarted","Data":"d5d67fb107ed895b01a859bdb0353f9d97c9dff054ffcd514298b8f4dda41499"} Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.938113 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cb6681be-0413-4154-87d9-710a23a9c463","Type":"ContainerStarted","Data":"e46b756c1307bedc1d064100755dd2dca53e35a1e42af226d0a9337bb946eb1c"} Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.938581 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-b4bd4568d-b9t26" event={"ID":"e2b2890b-92ed-42cc-b58b-8e04f86efd43","Type":"ContainerStarted","Data":"fb3f34ac6b3628e4ed3bc473849d4c8c5d5f3f1891f1ad5da5ebd438b3b93c75"} Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.941709 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3c5d1674-62cb-43db-aed4-52d966df5339","Type":"ContainerStarted","Data":"1b562f5de2d673eb1346768f088babbfc662b762fc782b09708ecec384d6ffa7"} Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.941747 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3c5d1674-62cb-43db-aed4-52d966df5339","Type":"ContainerStarted","Data":"aa37b3b33912a0caf996d24d5d87794efc36b2816106d2c8a9b4f2365daa1e2d"} Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.941760 
4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3c5d1674-62cb-43db-aed4-52d966df5339","Type":"ContainerStarted","Data":"5bbf22129b31f2ca958c39ac5f92f123a5c5f846b7f9faa52b0b503463e9da74"} Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.941768 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3c5d1674-62cb-43db-aed4-52d966df5339","Type":"ContainerStarted","Data":"7e5f936530423e7026aaadbee86ff33f48f4c7d55067172eab635da8de2e2f1d"} Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.941776 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3c5d1674-62cb-43db-aed4-52d966df5339","Type":"ContainerStarted","Data":"fd1e1895d2c28081f9d500f60851b7b4c2819a76834def8649d4ca08e35b4a15"} Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.943035 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7c88b9f8fd-2lsx5" event={"ID":"f2ca5e1b-022a-458a-ac63-7a66f3676329","Type":"ContainerStarted","Data":"16fee91cb657ff53d0589ffb0a1ac11a59368b8eb285182710501b75f40a6966"} Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.946916 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-868cc685f5-brvwb" event={"ID":"a95a9cbd-dcef-4b15-96a0-2efd23287563","Type":"ContainerStarted","Data":"7807483b9f8146fce1b8f5ad806939061ffb0e4ef408e6b1c376d6e6aa1ed582"} Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.946948 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-868cc685f5-brvwb" event={"ID":"a95a9cbd-dcef-4b15-96a0-2efd23287563","Type":"ContainerStarted","Data":"e17a6278c7f7a1a6036858fd27a5ade05bd0e745141f4ecc7e4ace9ed69c0b53"} Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.946960 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-868cc685f5-brvwb" event={"ID":"a95a9cbd-dcef-4b15-96a0-2efd23287563","Type":"ContainerStarted","Data":"788cf4f4c1c83ad65063c7647caec24cd0d2cd14ccd7dcf78364b078c9afceee"} Dec 09 15:04:28 crc kubenswrapper[4555]: I1209 15:04:28.947054 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/thanos-querier-868cc685f5-brvwb" Dec 09 15:04:29 crc kubenswrapper[4555]: I1209 15:04:29.004919 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=3.047223593 podStartE2EDuration="7.004906158s" podCreationTimestamp="2025-12-09 15:04:22 +0000 UTC" firstStartedPulling="2025-12-09 15:04:23.704941108 +0000 UTC m=+373.841653825" lastFinishedPulling="2025-12-09 15:04:27.662623673 +0000 UTC m=+377.799336390" observedRunningTime="2025-12-09 15:04:28.982181961 +0000 UTC m=+379.118894679" watchObservedRunningTime="2025-12-09 15:04:29.004906158 +0000 UTC m=+379.141618877" Dec 09 15:04:29 crc kubenswrapper[4555]: I1209 15:04:29.005202 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-868cc685f5-brvwb" podStartSLOduration=2.371948224 podStartE2EDuration="6.005197569s" podCreationTimestamp="2025-12-09 15:04:23 +0000 UTC" firstStartedPulling="2025-12-09 15:04:24.655013879 +0000 UTC m=+374.791726597" lastFinishedPulling="2025-12-09 15:04:28.288263224 +0000 UTC m=+378.424975942" observedRunningTime="2025-12-09 15:04:29.00505509 +0000 UTC 
m=+379.141767808" watchObservedRunningTime="2025-12-09 15:04:29.005197569 +0000 UTC m=+379.141910288" Dec 09 15:04:29 crc kubenswrapper[4555]: I1209 15:04:29.953824 4555 generic.go:334] "Generic (PLEG): container finished" podID="cb6681be-0413-4154-87d9-710a23a9c463" containerID="d5d67fb107ed895b01a859bdb0353f9d97c9dff054ffcd514298b8f4dda41499" exitCode=0 Dec 09 15:04:29 crc kubenswrapper[4555]: I1209 15:04:29.953928 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cb6681be-0413-4154-87d9-710a23a9c463","Type":"ContainerDied","Data":"d5d67fb107ed895b01a859bdb0353f9d97c9dff054ffcd514298b8f4dda41499"} Dec 09 15:04:30 crc kubenswrapper[4555]: I1209 15:04:30.964719 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-b4bd4568d-b9t26" event={"ID":"e2b2890b-92ed-42cc-b58b-8e04f86efd43","Type":"ContainerStarted","Data":"6e00e14dec1a3cb313d31beea3189729e7d41f4648f0c6d25a517725cb3e4732"} Dec 09 15:04:30 crc kubenswrapper[4555]: I1209 15:04:30.964977 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/monitoring-plugin-b4bd4568d-b9t26" Dec 09 15:04:30 crc kubenswrapper[4555]: I1209 15:04:30.967997 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7c88b9f8fd-2lsx5" event={"ID":"f2ca5e1b-022a-458a-ac63-7a66f3676329","Type":"ContainerStarted","Data":"53827dd043afa40edbb1ca887900370c0b50b660b0cd04fce386ace2db1b5ba1"} Dec 09 15:04:30 crc kubenswrapper[4555]: I1209 15:04:30.970727 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-b4bd4568d-b9t26" Dec 09 15:04:30 crc kubenswrapper[4555]: I1209 15:04:30.988471 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-b4bd4568d-b9t26" podStartSLOduration=2.289692735 podStartE2EDuration="3.988458855s" podCreationTimestamp="2025-12-09 15:04:27 +0000 UTC" firstStartedPulling="2025-12-09 15:04:28.651931069 +0000 UTC m=+378.788643787" lastFinishedPulling="2025-12-09 15:04:30.350697189 +0000 UTC m=+380.487409907" observedRunningTime="2025-12-09 15:04:30.983960994 +0000 UTC m=+381.120673712" watchObservedRunningTime="2025-12-09 15:04:30.988458855 +0000 UTC m=+381.125171573" Dec 09 15:04:31 crc kubenswrapper[4555]: I1209 15:04:31.009452 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-7c88b9f8fd-2lsx5" podStartSLOduration=2.244898766 podStartE2EDuration="4.009439241s" podCreationTimestamp="2025-12-09 15:04:27 +0000 UTC" firstStartedPulling="2025-12-09 15:04:28.583075247 +0000 UTC m=+378.719787965" lastFinishedPulling="2025-12-09 15:04:30.347615722 +0000 UTC m=+380.484328440" observedRunningTime="2025-12-09 15:04:31.008974814 +0000 UTC m=+381.145687532" watchObservedRunningTime="2025-12-09 15:04:31.009439241 +0000 UTC m=+381.146151959" Dec 09 15:04:31 crc kubenswrapper[4555]: I1209 15:04:31.782524 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-xfw89" Dec 09 15:04:31 crc kubenswrapper[4555]: I1209 15:04:31.815375 4555 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-wd6tn"] Dec 09 15:04:32 crc kubenswrapper[4555]: I1209 15:04:32.985626 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"cb6681be-0413-4154-87d9-710a23a9c463","Type":"ContainerStarted","Data":"2edadc9510f0b665b698a98e54a8f29139a52d284b4ee9ec697148f08aa56b37"} Dec 09 15:04:32 crc kubenswrapper[4555]: I1209 15:04:32.986003 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cb6681be-0413-4154-87d9-710a23a9c463","Type":"ContainerStarted","Data":"b5d29ede52c96c4b0d70c7ed8f8d60704811fbe0682116b1cff2e74ee4b4b0bd"} Dec 09 15:04:32 crc kubenswrapper[4555]: I1209 15:04:32.986017 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cb6681be-0413-4154-87d9-710a23a9c463","Type":"ContainerStarted","Data":"0e355b47dbb93306fb6bc32b23e1fb20db7fe686ec283cb4f85e0bd97c596e88"} Dec 09 15:04:33 crc kubenswrapper[4555]: I1209 15:04:33.992435 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cb6681be-0413-4154-87d9-710a23a9c463","Type":"ContainerStarted","Data":"d8ca5cca3c0d5985d58a28fd11e37d4e7a7cac6b2365e5972f6188650ff8275a"} Dec 09 15:04:33 crc kubenswrapper[4555]: I1209 15:04:33.992682 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cb6681be-0413-4154-87d9-710a23a9c463","Type":"ContainerStarted","Data":"3b1d67e74123063de2bdb7a0a4b34e589276cfe9fa0202a6aa8fabf0b1c14d49"} Dec 09 15:04:33 crc kubenswrapper[4555]: I1209 15:04:33.992696 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cb6681be-0413-4154-87d9-710a23a9c463","Type":"ContainerStarted","Data":"d88d1b1f7fd8f725580065f1808a34de7821c9c6270bcefac1d82944e4babc90"} Dec 09 15:04:34 crc kubenswrapper[4555]: I1209 15:04:34.014865 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=3.300611593 podStartE2EDuration="6.014849953s" podCreationTimestamp="2025-12-09 15:04:28 +0000 UTC" firstStartedPulling="2025-12-09 15:04:29.955666309 +0000 UTC m=+380.092379027" lastFinishedPulling="2025-12-09 15:04:32.669904669 +0000 UTC m=+382.806617387" observedRunningTime="2025-12-09 15:04:34.011833808 +0000 UTC m=+384.148546516" watchObservedRunningTime="2025-12-09 15:04:34.014849953 +0000 UTC m=+384.151562660" Dec 09 15:04:34 crc kubenswrapper[4555]: I1209 15:04:34.247256 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-868cc685f5-brvwb" Dec 09 15:04:37 crc kubenswrapper[4555]: I1209 15:04:37.124032 4555 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-c7f86c659-l8nlg" Dec 09 15:04:37 crc kubenswrapper[4555]: I1209 15:04:37.124268 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-c7f86c659-l8nlg" Dec 09 15:04:37 crc kubenswrapper[4555]: I1209 15:04:37.127766 4555 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-c7f86c659-l8nlg" Dec 09 15:04:38 crc kubenswrapper[4555]: I1209 15:04:38.017471 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-c7f86c659-l8nlg" Dec 09 15:04:38 crc kubenswrapper[4555]: I1209 15:04:38.047778 4555 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-vhncm"] Dec 09 15:04:38 crc kubenswrapper[4555]: I1209 15:04:38.444620 4555 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-monitoring/prometheus-k8s-0" Dec 09 15:04:47 crc kubenswrapper[4555]: I1209 15:04:47.668476 4555 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-7c88b9f8fd-2lsx5" Dec 09 15:04:47 crc kubenswrapper[4555]: I1209 15:04:47.668827 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/metrics-server-7c88b9f8fd-2lsx5" Dec 09 15:04:55 crc kubenswrapper[4555]: I1209 15:04:55.531410 4555 patch_prober.go:28] interesting pod/machine-config-daemon-x6p6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 15:04:55 crc kubenswrapper[4555]: I1209 15:04:55.531805 4555 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 15:04:55 crc kubenswrapper[4555]: I1209 15:04:55.531848 4555 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" Dec 09 15:04:55 crc kubenswrapper[4555]: I1209 15:04:55.532273 4555 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"180dd851cd6ac7ce99763c65c8a9c37630bce1147ad5777bcc34e7fd32e66125"} pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 15:04:55 crc kubenswrapper[4555]: I1209 15:04:55.532341 4555 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" containerName="machine-config-daemon" containerID="cri-o://180dd851cd6ac7ce99763c65c8a9c37630bce1147ad5777bcc34e7fd32e66125" gracePeriod=600 Dec 09 15:04:56 crc kubenswrapper[4555]: I1209 15:04:56.089519 4555 generic.go:334] "Generic (PLEG): container finished" podID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" containerID="180dd851cd6ac7ce99763c65c8a9c37630bce1147ad5777bcc34e7fd32e66125" exitCode=0 Dec 09 15:04:56 crc kubenswrapper[4555]: I1209 15:04:56.089601 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" event={"ID":"b929ab12-7d60-4e9f-a1a7-3e2d8b086d36","Type":"ContainerDied","Data":"180dd851cd6ac7ce99763c65c8a9c37630bce1147ad5777bcc34e7fd32e66125"} Dec 09 15:04:56 crc kubenswrapper[4555]: I1209 15:04:56.089696 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" event={"ID":"b929ab12-7d60-4e9f-a1a7-3e2d8b086d36","Type":"ContainerStarted","Data":"8dd058b28813c68e71df78106624226378245df5fe593a53aa3d62ba3999ec86"} Dec 09 15:04:56 crc kubenswrapper[4555]: I1209 15:04:56.089715 4555 scope.go:117] "RemoveContainer" containerID="0b220eef2e4180fd65d4a5381999b91753bb7322a9ab7101edf783cb800715fa" Dec 09 15:04:56 crc kubenswrapper[4555]: I1209 15:04:56.846739 4555 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-wd6tn" 
podUID="eee41022-2a9c-41c8-bbc8-ff42874c4050" containerName="registry" containerID="cri-o://0ee304509ba0fda875c576dca9bb9b28ecb72ef36c79d40ef8e10259157efcc8" gracePeriod=30 Dec 09 15:04:57 crc kubenswrapper[4555]: I1209 15:04:57.094966 4555 generic.go:334] "Generic (PLEG): container finished" podID="eee41022-2a9c-41c8-bbc8-ff42874c4050" containerID="0ee304509ba0fda875c576dca9bb9b28ecb72ef36c79d40ef8e10259157efcc8" exitCode=0 Dec 09 15:04:57 crc kubenswrapper[4555]: I1209 15:04:57.095238 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-wd6tn" event={"ID":"eee41022-2a9c-41c8-bbc8-ff42874c4050","Type":"ContainerDied","Data":"0ee304509ba0fda875c576dca9bb9b28ecb72ef36c79d40ef8e10259157efcc8"} Dec 09 15:04:57 crc kubenswrapper[4555]: I1209 15:04:57.608764 4555 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-wd6tn" Dec 09 15:04:57 crc kubenswrapper[4555]: I1209 15:04:57.731589 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/eee41022-2a9c-41c8-bbc8-ff42874c4050-registry-tls\") pod \"eee41022-2a9c-41c8-bbc8-ff42874c4050\" (UID: \"eee41022-2a9c-41c8-bbc8-ff42874c4050\") " Dec 09 15:04:57 crc kubenswrapper[4555]: I1209 15:04:57.731643 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/eee41022-2a9c-41c8-bbc8-ff42874c4050-ca-trust-extracted\") pod \"eee41022-2a9c-41c8-bbc8-ff42874c4050\" (UID: \"eee41022-2a9c-41c8-bbc8-ff42874c4050\") " Dec 09 15:04:57 crc kubenswrapper[4555]: I1209 15:04:57.731762 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"eee41022-2a9c-41c8-bbc8-ff42874c4050\" (UID: \"eee41022-2a9c-41c8-bbc8-ff42874c4050\") " Dec 09 15:04:57 crc kubenswrapper[4555]: I1209 15:04:57.731815 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/eee41022-2a9c-41c8-bbc8-ff42874c4050-bound-sa-token\") pod \"eee41022-2a9c-41c8-bbc8-ff42874c4050\" (UID: \"eee41022-2a9c-41c8-bbc8-ff42874c4050\") " Dec 09 15:04:57 crc kubenswrapper[4555]: I1209 15:04:57.731855 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/eee41022-2a9c-41c8-bbc8-ff42874c4050-installation-pull-secrets\") pod \"eee41022-2a9c-41c8-bbc8-ff42874c4050\" (UID: \"eee41022-2a9c-41c8-bbc8-ff42874c4050\") " Dec 09 15:04:57 crc kubenswrapper[4555]: I1209 15:04:57.731876 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dn2kc\" (UniqueName: \"kubernetes.io/projected/eee41022-2a9c-41c8-bbc8-ff42874c4050-kube-api-access-dn2kc\") pod \"eee41022-2a9c-41c8-bbc8-ff42874c4050\" (UID: \"eee41022-2a9c-41c8-bbc8-ff42874c4050\") " Dec 09 15:04:57 crc kubenswrapper[4555]: I1209 15:04:57.732492 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/eee41022-2a9c-41c8-bbc8-ff42874c4050-registry-certificates\") pod \"eee41022-2a9c-41c8-bbc8-ff42874c4050\" (UID: \"eee41022-2a9c-41c8-bbc8-ff42874c4050\") " Dec 09 15:04:57 crc kubenswrapper[4555]: 
I1209 15:04:57.732529 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eee41022-2a9c-41c8-bbc8-ff42874c4050-trusted-ca\") pod \"eee41022-2a9c-41c8-bbc8-ff42874c4050\" (UID: \"eee41022-2a9c-41c8-bbc8-ff42874c4050\") " Dec 09 15:04:57 crc kubenswrapper[4555]: I1209 15:04:57.733109 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eee41022-2a9c-41c8-bbc8-ff42874c4050-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "eee41022-2a9c-41c8-bbc8-ff42874c4050" (UID: "eee41022-2a9c-41c8-bbc8-ff42874c4050"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 15:04:57 crc kubenswrapper[4555]: I1209 15:04:57.733184 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eee41022-2a9c-41c8-bbc8-ff42874c4050-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "eee41022-2a9c-41c8-bbc8-ff42874c4050" (UID: "eee41022-2a9c-41c8-bbc8-ff42874c4050"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 15:04:57 crc kubenswrapper[4555]: I1209 15:04:57.736799 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eee41022-2a9c-41c8-bbc8-ff42874c4050-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "eee41022-2a9c-41c8-bbc8-ff42874c4050" (UID: "eee41022-2a9c-41c8-bbc8-ff42874c4050"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 15:04:57 crc kubenswrapper[4555]: I1209 15:04:57.736790 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eee41022-2a9c-41c8-bbc8-ff42874c4050-kube-api-access-dn2kc" (OuterVolumeSpecName: "kube-api-access-dn2kc") pod "eee41022-2a9c-41c8-bbc8-ff42874c4050" (UID: "eee41022-2a9c-41c8-bbc8-ff42874c4050"). InnerVolumeSpecName "kube-api-access-dn2kc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:04:57 crc kubenswrapper[4555]: I1209 15:04:57.737359 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eee41022-2a9c-41c8-bbc8-ff42874c4050-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "eee41022-2a9c-41c8-bbc8-ff42874c4050" (UID: "eee41022-2a9c-41c8-bbc8-ff42874c4050"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:04:57 crc kubenswrapper[4555]: I1209 15:04:57.737641 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eee41022-2a9c-41c8-bbc8-ff42874c4050-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "eee41022-2a9c-41c8-bbc8-ff42874c4050" (UID: "eee41022-2a9c-41c8-bbc8-ff42874c4050"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:04:57 crc kubenswrapper[4555]: I1209 15:04:57.738901 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "eee41022-2a9c-41c8-bbc8-ff42874c4050" (UID: "eee41022-2a9c-41c8-bbc8-ff42874c4050"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 09 15:04:57 crc kubenswrapper[4555]: I1209 15:04:57.746769 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eee41022-2a9c-41c8-bbc8-ff42874c4050-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "eee41022-2a9c-41c8-bbc8-ff42874c4050" (UID: "eee41022-2a9c-41c8-bbc8-ff42874c4050"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:04:57 crc kubenswrapper[4555]: I1209 15:04:57.833622 4555 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/eee41022-2a9c-41c8-bbc8-ff42874c4050-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 09 15:04:57 crc kubenswrapper[4555]: I1209 15:04:57.833643 4555 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eee41022-2a9c-41c8-bbc8-ff42874c4050-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 09 15:04:57 crc kubenswrapper[4555]: I1209 15:04:57.833652 4555 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/eee41022-2a9c-41c8-bbc8-ff42874c4050-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 09 15:04:57 crc kubenswrapper[4555]: I1209 15:04:57.833661 4555 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/eee41022-2a9c-41c8-bbc8-ff42874c4050-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 09 15:04:57 crc kubenswrapper[4555]: I1209 15:04:57.833669 4555 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/eee41022-2a9c-41c8-bbc8-ff42874c4050-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 09 15:04:57 crc kubenswrapper[4555]: I1209 15:04:57.833677 4555 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/eee41022-2a9c-41c8-bbc8-ff42874c4050-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 09 15:04:57 crc kubenswrapper[4555]: I1209 15:04:57.833684 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dn2kc\" (UniqueName: \"kubernetes.io/projected/eee41022-2a9c-41c8-bbc8-ff42874c4050-kube-api-access-dn2kc\") on node \"crc\" DevicePath \"\"" Dec 09 15:04:58 crc kubenswrapper[4555]: I1209 15:04:58.102476 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-wd6tn" event={"ID":"eee41022-2a9c-41c8-bbc8-ff42874c4050","Type":"ContainerDied","Data":"7ed3ad04539c4041ff4a4e69a1a26dc71100e7c02fe83bef54acdffc8de5e5ae"} Dec 09 15:04:58 crc kubenswrapper[4555]: I1209 15:04:58.102523 4555 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-wd6tn" Dec 09 15:04:58 crc kubenswrapper[4555]: I1209 15:04:58.103124 4555 scope.go:117] "RemoveContainer" containerID="0ee304509ba0fda875c576dca9bb9b28ecb72ef36c79d40ef8e10259157efcc8" Dec 09 15:04:58 crc kubenswrapper[4555]: I1209 15:04:58.123853 4555 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-wd6tn"] Dec 09 15:04:58 crc kubenswrapper[4555]: I1209 15:04:58.127092 4555 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-wd6tn"] Dec 09 15:04:58 crc kubenswrapper[4555]: I1209 15:04:58.167408 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eee41022-2a9c-41c8-bbc8-ff42874c4050" path="/var/lib/kubelet/pods/eee41022-2a9c-41c8-bbc8-ff42874c4050/volumes" Dec 09 15:05:03 crc kubenswrapper[4555]: I1209 15:05:03.072109 4555 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-vhncm" podUID="a7fc2be6-0cb9-4fad-ac58-37004264a563" containerName="console" containerID="cri-o://e0e79c2f0a37b71ee10a288e3c2f7510f1b093c170102011ea999c6d4c0198b1" gracePeriod=15 Dec 09 15:05:03 crc kubenswrapper[4555]: I1209 15:05:03.362647 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-vhncm_a7fc2be6-0cb9-4fad-ac58-37004264a563/console/0.log" Dec 09 15:05:03 crc kubenswrapper[4555]: I1209 15:05:03.362829 4555 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-vhncm" Dec 09 15:05:03 crc kubenswrapper[4555]: I1209 15:05:03.494540 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a7fc2be6-0cb9-4fad-ac58-37004264a563-oauth-serving-cert\") pod \"a7fc2be6-0cb9-4fad-ac58-37004264a563\" (UID: \"a7fc2be6-0cb9-4fad-ac58-37004264a563\") " Dec 09 15:05:03 crc kubenswrapper[4555]: I1209 15:05:03.494611 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a7fc2be6-0cb9-4fad-ac58-37004264a563-console-config\") pod \"a7fc2be6-0cb9-4fad-ac58-37004264a563\" (UID: \"a7fc2be6-0cb9-4fad-ac58-37004264a563\") " Dec 09 15:05:03 crc kubenswrapper[4555]: I1209 15:05:03.494661 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ns6z2\" (UniqueName: \"kubernetes.io/projected/a7fc2be6-0cb9-4fad-ac58-37004264a563-kube-api-access-ns6z2\") pod \"a7fc2be6-0cb9-4fad-ac58-37004264a563\" (UID: \"a7fc2be6-0cb9-4fad-ac58-37004264a563\") " Dec 09 15:05:03 crc kubenswrapper[4555]: I1209 15:05:03.494720 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7fc2be6-0cb9-4fad-ac58-37004264a563-trusted-ca-bundle\") pod \"a7fc2be6-0cb9-4fad-ac58-37004264a563\" (UID: \"a7fc2be6-0cb9-4fad-ac58-37004264a563\") " Dec 09 15:05:03 crc kubenswrapper[4555]: I1209 15:05:03.494756 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a7fc2be6-0cb9-4fad-ac58-37004264a563-service-ca\") pod \"a7fc2be6-0cb9-4fad-ac58-37004264a563\" (UID: \"a7fc2be6-0cb9-4fad-ac58-37004264a563\") " Dec 09 15:05:03 crc kubenswrapper[4555]: I1209 15:05:03.494788 4555 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a7fc2be6-0cb9-4fad-ac58-37004264a563-console-serving-cert\") pod \"a7fc2be6-0cb9-4fad-ac58-37004264a563\" (UID: \"a7fc2be6-0cb9-4fad-ac58-37004264a563\") " Dec 09 15:05:03 crc kubenswrapper[4555]: I1209 15:05:03.494803 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a7fc2be6-0cb9-4fad-ac58-37004264a563-console-oauth-config\") pod \"a7fc2be6-0cb9-4fad-ac58-37004264a563\" (UID: \"a7fc2be6-0cb9-4fad-ac58-37004264a563\") " Dec 09 15:05:03 crc kubenswrapper[4555]: I1209 15:05:03.495414 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7fc2be6-0cb9-4fad-ac58-37004264a563-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "a7fc2be6-0cb9-4fad-ac58-37004264a563" (UID: "a7fc2be6-0cb9-4fad-ac58-37004264a563"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 15:05:03 crc kubenswrapper[4555]: I1209 15:05:03.495460 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7fc2be6-0cb9-4fad-ac58-37004264a563-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "a7fc2be6-0cb9-4fad-ac58-37004264a563" (UID: "a7fc2be6-0cb9-4fad-ac58-37004264a563"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 15:05:03 crc kubenswrapper[4555]: I1209 15:05:03.495489 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7fc2be6-0cb9-4fad-ac58-37004264a563-service-ca" (OuterVolumeSpecName: "service-ca") pod "a7fc2be6-0cb9-4fad-ac58-37004264a563" (UID: "a7fc2be6-0cb9-4fad-ac58-37004264a563"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 15:05:03 crc kubenswrapper[4555]: I1209 15:05:03.495778 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7fc2be6-0cb9-4fad-ac58-37004264a563-console-config" (OuterVolumeSpecName: "console-config") pod "a7fc2be6-0cb9-4fad-ac58-37004264a563" (UID: "a7fc2be6-0cb9-4fad-ac58-37004264a563"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 15:05:03 crc kubenswrapper[4555]: I1209 15:05:03.498826 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7fc2be6-0cb9-4fad-ac58-37004264a563-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "a7fc2be6-0cb9-4fad-ac58-37004264a563" (UID: "a7fc2be6-0cb9-4fad-ac58-37004264a563"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 15:05:03 crc kubenswrapper[4555]: I1209 15:05:03.499032 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7fc2be6-0cb9-4fad-ac58-37004264a563-kube-api-access-ns6z2" (OuterVolumeSpecName: "kube-api-access-ns6z2") pod "a7fc2be6-0cb9-4fad-ac58-37004264a563" (UID: "a7fc2be6-0cb9-4fad-ac58-37004264a563"). InnerVolumeSpecName "kube-api-access-ns6z2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:05:03 crc kubenswrapper[4555]: I1209 15:05:03.499047 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7fc2be6-0cb9-4fad-ac58-37004264a563-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "a7fc2be6-0cb9-4fad-ac58-37004264a563" (UID: "a7fc2be6-0cb9-4fad-ac58-37004264a563"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 15:05:03 crc kubenswrapper[4555]: I1209 15:05:03.596276 4555 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a7fc2be6-0cb9-4fad-ac58-37004264a563-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 15:05:03 crc kubenswrapper[4555]: I1209 15:05:03.596314 4555 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a7fc2be6-0cb9-4fad-ac58-37004264a563-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 09 15:05:03 crc kubenswrapper[4555]: I1209 15:05:03.596324 4555 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a7fc2be6-0cb9-4fad-ac58-37004264a563-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 15:05:03 crc kubenswrapper[4555]: I1209 15:05:03.596333 4555 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a7fc2be6-0cb9-4fad-ac58-37004264a563-console-config\") on node \"crc\" DevicePath \"\"" Dec 09 15:05:03 crc kubenswrapper[4555]: I1209 15:05:03.596342 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ns6z2\" (UniqueName: \"kubernetes.io/projected/a7fc2be6-0cb9-4fad-ac58-37004264a563-kube-api-access-ns6z2\") on node \"crc\" DevicePath \"\"" Dec 09 15:05:03 crc kubenswrapper[4555]: I1209 15:05:03.596351 4555 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7fc2be6-0cb9-4fad-ac58-37004264a563-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 15:05:03 crc kubenswrapper[4555]: I1209 15:05:03.596358 4555 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a7fc2be6-0cb9-4fad-ac58-37004264a563-service-ca\") on node \"crc\" DevicePath \"\"" Dec 09 15:05:04 crc kubenswrapper[4555]: I1209 15:05:04.130815 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-vhncm_a7fc2be6-0cb9-4fad-ac58-37004264a563/console/0.log" Dec 09 15:05:04 crc kubenswrapper[4555]: I1209 15:05:04.130983 4555 generic.go:334] "Generic (PLEG): container finished" podID="a7fc2be6-0cb9-4fad-ac58-37004264a563" containerID="e0e79c2f0a37b71ee10a288e3c2f7510f1b093c170102011ea999c6d4c0198b1" exitCode=2 Dec 09 15:05:04 crc kubenswrapper[4555]: I1209 15:05:04.131006 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-vhncm" event={"ID":"a7fc2be6-0cb9-4fad-ac58-37004264a563","Type":"ContainerDied","Data":"e0e79c2f0a37b71ee10a288e3c2f7510f1b093c170102011ea999c6d4c0198b1"} Dec 09 15:05:04 crc kubenswrapper[4555]: I1209 15:05:04.131028 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-vhncm" event={"ID":"a7fc2be6-0cb9-4fad-ac58-37004264a563","Type":"ContainerDied","Data":"112374ee17f88d33cbf08266528bee8cbb4629d6ff74862e0fa4b05b0d3c27ae"} Dec 09 15:05:04 crc 
kubenswrapper[4555]: I1209 15:05:04.131042 4555 scope.go:117] "RemoveContainer" containerID="e0e79c2f0a37b71ee10a288e3c2f7510f1b093c170102011ea999c6d4c0198b1" Dec 09 15:05:04 crc kubenswrapper[4555]: I1209 15:05:04.131128 4555 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-vhncm" Dec 09 15:05:04 crc kubenswrapper[4555]: I1209 15:05:04.142546 4555 scope.go:117] "RemoveContainer" containerID="e0e79c2f0a37b71ee10a288e3c2f7510f1b093c170102011ea999c6d4c0198b1" Dec 09 15:05:04 crc kubenswrapper[4555]: E1209 15:05:04.142888 4555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0e79c2f0a37b71ee10a288e3c2f7510f1b093c170102011ea999c6d4c0198b1\": container with ID starting with e0e79c2f0a37b71ee10a288e3c2f7510f1b093c170102011ea999c6d4c0198b1 not found: ID does not exist" containerID="e0e79c2f0a37b71ee10a288e3c2f7510f1b093c170102011ea999c6d4c0198b1" Dec 09 15:05:04 crc kubenswrapper[4555]: I1209 15:05:04.142930 4555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0e79c2f0a37b71ee10a288e3c2f7510f1b093c170102011ea999c6d4c0198b1"} err="failed to get container status \"e0e79c2f0a37b71ee10a288e3c2f7510f1b093c170102011ea999c6d4c0198b1\": rpc error: code = NotFound desc = could not find container \"e0e79c2f0a37b71ee10a288e3c2f7510f1b093c170102011ea999c6d4c0198b1\": container with ID starting with e0e79c2f0a37b71ee10a288e3c2f7510f1b093c170102011ea999c6d4c0198b1 not found: ID does not exist" Dec 09 15:05:04 crc kubenswrapper[4555]: I1209 15:05:04.152421 4555 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-vhncm"] Dec 09 15:05:04 crc kubenswrapper[4555]: I1209 15:05:04.155653 4555 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-vhncm"] Dec 09 15:05:04 crc kubenswrapper[4555]: I1209 15:05:04.166634 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7fc2be6-0cb9-4fad-ac58-37004264a563" path="/var/lib/kubelet/pods/a7fc2be6-0cb9-4fad-ac58-37004264a563/volumes" Dec 09 15:05:07 crc kubenswrapper[4555]: I1209 15:05:07.673713 4555 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-7c88b9f8fd-2lsx5" Dec 09 15:05:07 crc kubenswrapper[4555]: I1209 15:05:07.677022 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-7c88b9f8fd-2lsx5" Dec 09 15:05:28 crc kubenswrapper[4555]: I1209 15:05:28.445251 4555 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Dec 09 15:05:28 crc kubenswrapper[4555]: I1209 15:05:28.483962 4555 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Dec 09 15:05:29 crc kubenswrapper[4555]: I1209 15:05:29.253596 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Dec 09 15:05:39 crc kubenswrapper[4555]: I1209 15:05:39.299597 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-58c6fb9d58-n5r2c"] Dec 09 15:05:39 crc kubenswrapper[4555]: E1209 15:05:39.301743 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7fc2be6-0cb9-4fad-ac58-37004264a563" containerName="console" Dec 09 15:05:39 crc kubenswrapper[4555]: I1209 15:05:39.301827 4555 
state_mem.go:107] "Deleted CPUSet assignment" podUID="a7fc2be6-0cb9-4fad-ac58-37004264a563" containerName="console" Dec 09 15:05:39 crc kubenswrapper[4555]: E1209 15:05:39.301881 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eee41022-2a9c-41c8-bbc8-ff42874c4050" containerName="registry" Dec 09 15:05:39 crc kubenswrapper[4555]: I1209 15:05:39.301923 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="eee41022-2a9c-41c8-bbc8-ff42874c4050" containerName="registry" Dec 09 15:05:39 crc kubenswrapper[4555]: I1209 15:05:39.302138 4555 memory_manager.go:354] "RemoveStaleState removing state" podUID="eee41022-2a9c-41c8-bbc8-ff42874c4050" containerName="registry" Dec 09 15:05:39 crc kubenswrapper[4555]: I1209 15:05:39.302205 4555 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7fc2be6-0cb9-4fad-ac58-37004264a563" containerName="console" Dec 09 15:05:39 crc kubenswrapper[4555]: I1209 15:05:39.302716 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-58c6fb9d58-n5r2c" Dec 09 15:05:39 crc kubenswrapper[4555]: I1209 15:05:39.353214 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-58c6fb9d58-n5r2c"] Dec 09 15:05:39 crc kubenswrapper[4555]: I1209 15:05:39.374907 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e80fda3a-44a2-4671-899a-b041186264b5-service-ca\") pod \"console-58c6fb9d58-n5r2c\" (UID: \"e80fda3a-44a2-4671-899a-b041186264b5\") " pod="openshift-console/console-58c6fb9d58-n5r2c" Dec 09 15:05:39 crc kubenswrapper[4555]: I1209 15:05:39.374974 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e80fda3a-44a2-4671-899a-b041186264b5-oauth-serving-cert\") pod \"console-58c6fb9d58-n5r2c\" (UID: \"e80fda3a-44a2-4671-899a-b041186264b5\") " pod="openshift-console/console-58c6fb9d58-n5r2c" Dec 09 15:05:39 crc kubenswrapper[4555]: I1209 15:05:39.374993 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6bft\" (UniqueName: \"kubernetes.io/projected/e80fda3a-44a2-4671-899a-b041186264b5-kube-api-access-c6bft\") pod \"console-58c6fb9d58-n5r2c\" (UID: \"e80fda3a-44a2-4671-899a-b041186264b5\") " pod="openshift-console/console-58c6fb9d58-n5r2c" Dec 09 15:05:39 crc kubenswrapper[4555]: I1209 15:05:39.375032 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e80fda3a-44a2-4671-899a-b041186264b5-console-serving-cert\") pod \"console-58c6fb9d58-n5r2c\" (UID: \"e80fda3a-44a2-4671-899a-b041186264b5\") " pod="openshift-console/console-58c6fb9d58-n5r2c" Dec 09 15:05:39 crc kubenswrapper[4555]: I1209 15:05:39.375058 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e80fda3a-44a2-4671-899a-b041186264b5-console-config\") pod \"console-58c6fb9d58-n5r2c\" (UID: \"e80fda3a-44a2-4671-899a-b041186264b5\") " pod="openshift-console/console-58c6fb9d58-n5r2c" Dec 09 15:05:39 crc kubenswrapper[4555]: I1209 15:05:39.375089 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/e80fda3a-44a2-4671-899a-b041186264b5-console-oauth-config\") pod \"console-58c6fb9d58-n5r2c\" (UID: \"e80fda3a-44a2-4671-899a-b041186264b5\") " pod="openshift-console/console-58c6fb9d58-n5r2c" Dec 09 15:05:39 crc kubenswrapper[4555]: I1209 15:05:39.375107 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e80fda3a-44a2-4671-899a-b041186264b5-trusted-ca-bundle\") pod \"console-58c6fb9d58-n5r2c\" (UID: \"e80fda3a-44a2-4671-899a-b041186264b5\") " pod="openshift-console/console-58c6fb9d58-n5r2c" Dec 09 15:05:39 crc kubenswrapper[4555]: I1209 15:05:39.476522 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e80fda3a-44a2-4671-899a-b041186264b5-oauth-serving-cert\") pod \"console-58c6fb9d58-n5r2c\" (UID: \"e80fda3a-44a2-4671-899a-b041186264b5\") " pod="openshift-console/console-58c6fb9d58-n5r2c" Dec 09 15:05:39 crc kubenswrapper[4555]: I1209 15:05:39.476677 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6bft\" (UniqueName: \"kubernetes.io/projected/e80fda3a-44a2-4671-899a-b041186264b5-kube-api-access-c6bft\") pod \"console-58c6fb9d58-n5r2c\" (UID: \"e80fda3a-44a2-4671-899a-b041186264b5\") " pod="openshift-console/console-58c6fb9d58-n5r2c" Dec 09 15:05:39 crc kubenswrapper[4555]: I1209 15:05:39.476809 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e80fda3a-44a2-4671-899a-b041186264b5-console-serving-cert\") pod \"console-58c6fb9d58-n5r2c\" (UID: \"e80fda3a-44a2-4671-899a-b041186264b5\") " pod="openshift-console/console-58c6fb9d58-n5r2c" Dec 09 15:05:39 crc kubenswrapper[4555]: I1209 15:05:39.476903 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e80fda3a-44a2-4671-899a-b041186264b5-console-config\") pod \"console-58c6fb9d58-n5r2c\" (UID: \"e80fda3a-44a2-4671-899a-b041186264b5\") " pod="openshift-console/console-58c6fb9d58-n5r2c" Dec 09 15:05:39 crc kubenswrapper[4555]: I1209 15:05:39.477035 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e80fda3a-44a2-4671-899a-b041186264b5-console-oauth-config\") pod \"console-58c6fb9d58-n5r2c\" (UID: \"e80fda3a-44a2-4671-899a-b041186264b5\") " pod="openshift-console/console-58c6fb9d58-n5r2c" Dec 09 15:05:39 crc kubenswrapper[4555]: I1209 15:05:39.477119 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e80fda3a-44a2-4671-899a-b041186264b5-trusted-ca-bundle\") pod \"console-58c6fb9d58-n5r2c\" (UID: \"e80fda3a-44a2-4671-899a-b041186264b5\") " pod="openshift-console/console-58c6fb9d58-n5r2c" Dec 09 15:05:39 crc kubenswrapper[4555]: I1209 15:05:39.477324 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e80fda3a-44a2-4671-899a-b041186264b5-service-ca\") pod \"console-58c6fb9d58-n5r2c\" (UID: \"e80fda3a-44a2-4671-899a-b041186264b5\") " pod="openshift-console/console-58c6fb9d58-n5r2c" Dec 09 15:05:39 crc kubenswrapper[4555]: I1209 15:05:39.477726 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" 
(UniqueName: \"kubernetes.io/configmap/e80fda3a-44a2-4671-899a-b041186264b5-console-config\") pod \"console-58c6fb9d58-n5r2c\" (UID: \"e80fda3a-44a2-4671-899a-b041186264b5\") " pod="openshift-console/console-58c6fb9d58-n5r2c" Dec 09 15:05:39 crc kubenswrapper[4555]: I1209 15:05:39.477874 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e80fda3a-44a2-4671-899a-b041186264b5-trusted-ca-bundle\") pod \"console-58c6fb9d58-n5r2c\" (UID: \"e80fda3a-44a2-4671-899a-b041186264b5\") " pod="openshift-console/console-58c6fb9d58-n5r2c" Dec 09 15:05:39 crc kubenswrapper[4555]: I1209 15:05:39.478116 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e80fda3a-44a2-4671-899a-b041186264b5-oauth-serving-cert\") pod \"console-58c6fb9d58-n5r2c\" (UID: \"e80fda3a-44a2-4671-899a-b041186264b5\") " pod="openshift-console/console-58c6fb9d58-n5r2c" Dec 09 15:05:39 crc kubenswrapper[4555]: I1209 15:05:39.478290 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e80fda3a-44a2-4671-899a-b041186264b5-service-ca\") pod \"console-58c6fb9d58-n5r2c\" (UID: \"e80fda3a-44a2-4671-899a-b041186264b5\") " pod="openshift-console/console-58c6fb9d58-n5r2c" Dec 09 15:05:39 crc kubenswrapper[4555]: I1209 15:05:39.482081 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e80fda3a-44a2-4671-899a-b041186264b5-console-serving-cert\") pod \"console-58c6fb9d58-n5r2c\" (UID: \"e80fda3a-44a2-4671-899a-b041186264b5\") " pod="openshift-console/console-58c6fb9d58-n5r2c" Dec 09 15:05:39 crc kubenswrapper[4555]: I1209 15:05:39.482096 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e80fda3a-44a2-4671-899a-b041186264b5-console-oauth-config\") pod \"console-58c6fb9d58-n5r2c\" (UID: \"e80fda3a-44a2-4671-899a-b041186264b5\") " pod="openshift-console/console-58c6fb9d58-n5r2c" Dec 09 15:05:39 crc kubenswrapper[4555]: I1209 15:05:39.492777 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6bft\" (UniqueName: \"kubernetes.io/projected/e80fda3a-44a2-4671-899a-b041186264b5-kube-api-access-c6bft\") pod \"console-58c6fb9d58-n5r2c\" (UID: \"e80fda3a-44a2-4671-899a-b041186264b5\") " pod="openshift-console/console-58c6fb9d58-n5r2c" Dec 09 15:05:39 crc kubenswrapper[4555]: I1209 15:05:39.618822 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-58c6fb9d58-n5r2c" Dec 09 15:05:39 crc kubenswrapper[4555]: I1209 15:05:39.775811 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-58c6fb9d58-n5r2c"] Dec 09 15:05:40 crc kubenswrapper[4555]: I1209 15:05:40.279371 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-58c6fb9d58-n5r2c" event={"ID":"e80fda3a-44a2-4671-899a-b041186264b5","Type":"ContainerStarted","Data":"022f771bbe19f1266d06ea313b354900c231527e2cb369b81f99e19878b29f6f"} Dec 09 15:05:40 crc kubenswrapper[4555]: I1209 15:05:40.279685 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-58c6fb9d58-n5r2c" event={"ID":"e80fda3a-44a2-4671-899a-b041186264b5","Type":"ContainerStarted","Data":"8695f24c55619ab8e1b1b5797b4e7956d4210ecf4fd3928b4c9d8ed021f27072"} Dec 09 15:05:40 crc kubenswrapper[4555]: I1209 15:05:40.299158 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-58c6fb9d58-n5r2c" podStartSLOduration=1.2991421619999999 podStartE2EDuration="1.299142162s" podCreationTimestamp="2025-12-09 15:05:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:05:40.29791653 +0000 UTC m=+450.434629249" watchObservedRunningTime="2025-12-09 15:05:40.299142162 +0000 UTC m=+450.435854870" Dec 09 15:05:49 crc kubenswrapper[4555]: I1209 15:05:49.619252 4555 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-58c6fb9d58-n5r2c" Dec 09 15:05:49 crc kubenswrapper[4555]: I1209 15:05:49.619884 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-58c6fb9d58-n5r2c" Dec 09 15:05:49 crc kubenswrapper[4555]: I1209 15:05:49.623449 4555 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-58c6fb9d58-n5r2c" Dec 09 15:05:50 crc kubenswrapper[4555]: I1209 15:05:50.322865 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-58c6fb9d58-n5r2c" Dec 09 15:05:50 crc kubenswrapper[4555]: I1209 15:05:50.367048 4555 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-c7f86c659-l8nlg"] Dec 09 15:06:15 crc kubenswrapper[4555]: I1209 15:06:15.393803 4555 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-c7f86c659-l8nlg" podUID="eef1d965-488b-465b-bc5b-ce417339682b" containerName="console" containerID="cri-o://1865c132d3fb31fe70531d6c03bf6150c6ae1febaff26ecc903c75d3884e978f" gracePeriod=15 Dec 09 15:06:15 crc kubenswrapper[4555]: I1209 15:06:15.673284 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-c7f86c659-l8nlg_eef1d965-488b-465b-bc5b-ce417339682b/console/0.log" Dec 09 15:06:15 crc kubenswrapper[4555]: I1209 15:06:15.673477 4555 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-c7f86c659-l8nlg" Dec 09 15:06:15 crc kubenswrapper[4555]: I1209 15:06:15.873638 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eef1d965-488b-465b-bc5b-ce417339682b-console-config\") pod \"eef1d965-488b-465b-bc5b-ce417339682b\" (UID: \"eef1d965-488b-465b-bc5b-ce417339682b\") " Dec 09 15:06:15 crc kubenswrapper[4555]: I1209 15:06:15.873669 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7b7d7\" (UniqueName: \"kubernetes.io/projected/eef1d965-488b-465b-bc5b-ce417339682b-kube-api-access-7b7d7\") pod \"eef1d965-488b-465b-bc5b-ce417339682b\" (UID: \"eef1d965-488b-465b-bc5b-ce417339682b\") " Dec 09 15:06:15 crc kubenswrapper[4555]: I1209 15:06:15.873692 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eef1d965-488b-465b-bc5b-ce417339682b-console-oauth-config\") pod \"eef1d965-488b-465b-bc5b-ce417339682b\" (UID: \"eef1d965-488b-465b-bc5b-ce417339682b\") " Dec 09 15:06:15 crc kubenswrapper[4555]: I1209 15:06:15.873708 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eef1d965-488b-465b-bc5b-ce417339682b-console-serving-cert\") pod \"eef1d965-488b-465b-bc5b-ce417339682b\" (UID: \"eef1d965-488b-465b-bc5b-ce417339682b\") " Dec 09 15:06:15 crc kubenswrapper[4555]: I1209 15:06:15.873729 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eef1d965-488b-465b-bc5b-ce417339682b-oauth-serving-cert\") pod \"eef1d965-488b-465b-bc5b-ce417339682b\" (UID: \"eef1d965-488b-465b-bc5b-ce417339682b\") " Dec 09 15:06:15 crc kubenswrapper[4555]: I1209 15:06:15.873808 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eef1d965-488b-465b-bc5b-ce417339682b-trusted-ca-bundle\") pod \"eef1d965-488b-465b-bc5b-ce417339682b\" (UID: \"eef1d965-488b-465b-bc5b-ce417339682b\") " Dec 09 15:06:15 crc kubenswrapper[4555]: I1209 15:06:15.873832 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eef1d965-488b-465b-bc5b-ce417339682b-service-ca\") pod \"eef1d965-488b-465b-bc5b-ce417339682b\" (UID: \"eef1d965-488b-465b-bc5b-ce417339682b\") " Dec 09 15:06:15 crc kubenswrapper[4555]: I1209 15:06:15.874484 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eef1d965-488b-465b-bc5b-ce417339682b-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "eef1d965-488b-465b-bc5b-ce417339682b" (UID: "eef1d965-488b-465b-bc5b-ce417339682b"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 15:06:15 crc kubenswrapper[4555]: I1209 15:06:15.874509 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eef1d965-488b-465b-bc5b-ce417339682b-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "eef1d965-488b-465b-bc5b-ce417339682b" (UID: "eef1d965-488b-465b-bc5b-ce417339682b"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 15:06:15 crc kubenswrapper[4555]: I1209 15:06:15.874498 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eef1d965-488b-465b-bc5b-ce417339682b-service-ca" (OuterVolumeSpecName: "service-ca") pod "eef1d965-488b-465b-bc5b-ce417339682b" (UID: "eef1d965-488b-465b-bc5b-ce417339682b"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 15:06:15 crc kubenswrapper[4555]: I1209 15:06:15.874689 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eef1d965-488b-465b-bc5b-ce417339682b-console-config" (OuterVolumeSpecName: "console-config") pod "eef1d965-488b-465b-bc5b-ce417339682b" (UID: "eef1d965-488b-465b-bc5b-ce417339682b"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 15:06:15 crc kubenswrapper[4555]: I1209 15:06:15.877657 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eef1d965-488b-465b-bc5b-ce417339682b-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "eef1d965-488b-465b-bc5b-ce417339682b" (UID: "eef1d965-488b-465b-bc5b-ce417339682b"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 15:06:15 crc kubenswrapper[4555]: I1209 15:06:15.877718 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eef1d965-488b-465b-bc5b-ce417339682b-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "eef1d965-488b-465b-bc5b-ce417339682b" (UID: "eef1d965-488b-465b-bc5b-ce417339682b"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 15:06:15 crc kubenswrapper[4555]: I1209 15:06:15.877900 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eef1d965-488b-465b-bc5b-ce417339682b-kube-api-access-7b7d7" (OuterVolumeSpecName: "kube-api-access-7b7d7") pod "eef1d965-488b-465b-bc5b-ce417339682b" (UID: "eef1d965-488b-465b-bc5b-ce417339682b"). InnerVolumeSpecName "kube-api-access-7b7d7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:06:15 crc kubenswrapper[4555]: I1209 15:06:15.975020 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7b7d7\" (UniqueName: \"kubernetes.io/projected/eef1d965-488b-465b-bc5b-ce417339682b-kube-api-access-7b7d7\") on node \"crc\" DevicePath \"\"" Dec 09 15:06:15 crc kubenswrapper[4555]: I1209 15:06:15.975166 4555 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eef1d965-488b-465b-bc5b-ce417339682b-console-config\") on node \"crc\" DevicePath \"\"" Dec 09 15:06:15 crc kubenswrapper[4555]: I1209 15:06:15.975227 4555 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eef1d965-488b-465b-bc5b-ce417339682b-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 09 15:06:15 crc kubenswrapper[4555]: I1209 15:06:15.975274 4555 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eef1d965-488b-465b-bc5b-ce417339682b-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 15:06:15 crc kubenswrapper[4555]: I1209 15:06:15.975359 4555 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eef1d965-488b-465b-bc5b-ce417339682b-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 15:06:15 crc kubenswrapper[4555]: I1209 15:06:15.975415 4555 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eef1d965-488b-465b-bc5b-ce417339682b-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 15:06:15 crc kubenswrapper[4555]: I1209 15:06:15.975462 4555 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eef1d965-488b-465b-bc5b-ce417339682b-service-ca\") on node \"crc\" DevicePath \"\"" Dec 09 15:06:16 crc kubenswrapper[4555]: I1209 15:06:16.422973 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-c7f86c659-l8nlg_eef1d965-488b-465b-bc5b-ce417339682b/console/0.log" Dec 09 15:06:16 crc kubenswrapper[4555]: I1209 15:06:16.423023 4555 generic.go:334] "Generic (PLEG): container finished" podID="eef1d965-488b-465b-bc5b-ce417339682b" containerID="1865c132d3fb31fe70531d6c03bf6150c6ae1febaff26ecc903c75d3884e978f" exitCode=2 Dec 09 15:06:16 crc kubenswrapper[4555]: I1209 15:06:16.423051 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c7f86c659-l8nlg" event={"ID":"eef1d965-488b-465b-bc5b-ce417339682b","Type":"ContainerDied","Data":"1865c132d3fb31fe70531d6c03bf6150c6ae1febaff26ecc903c75d3884e978f"} Dec 09 15:06:16 crc kubenswrapper[4555]: I1209 15:06:16.423080 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c7f86c659-l8nlg" event={"ID":"eef1d965-488b-465b-bc5b-ce417339682b","Type":"ContainerDied","Data":"40e5a9ff883fe765e824450a97569f1e385407c6e7c314475c9958758dbb7a8f"} Dec 09 15:06:16 crc kubenswrapper[4555]: I1209 15:06:16.423079 4555 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-c7f86c659-l8nlg" Dec 09 15:06:16 crc kubenswrapper[4555]: I1209 15:06:16.423093 4555 scope.go:117] "RemoveContainer" containerID="1865c132d3fb31fe70531d6c03bf6150c6ae1febaff26ecc903c75d3884e978f" Dec 09 15:06:16 crc kubenswrapper[4555]: I1209 15:06:16.434656 4555 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-c7f86c659-l8nlg"] Dec 09 15:06:16 crc kubenswrapper[4555]: I1209 15:06:16.435641 4555 scope.go:117] "RemoveContainer" containerID="1865c132d3fb31fe70531d6c03bf6150c6ae1febaff26ecc903c75d3884e978f" Dec 09 15:06:16 crc kubenswrapper[4555]: E1209 15:06:16.436047 4555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1865c132d3fb31fe70531d6c03bf6150c6ae1febaff26ecc903c75d3884e978f\": container with ID starting with 1865c132d3fb31fe70531d6c03bf6150c6ae1febaff26ecc903c75d3884e978f not found: ID does not exist" containerID="1865c132d3fb31fe70531d6c03bf6150c6ae1febaff26ecc903c75d3884e978f" Dec 09 15:06:16 crc kubenswrapper[4555]: I1209 15:06:16.436161 4555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1865c132d3fb31fe70531d6c03bf6150c6ae1febaff26ecc903c75d3884e978f"} err="failed to get container status \"1865c132d3fb31fe70531d6c03bf6150c6ae1febaff26ecc903c75d3884e978f\": rpc error: code = NotFound desc = could not find container \"1865c132d3fb31fe70531d6c03bf6150c6ae1febaff26ecc903c75d3884e978f\": container with ID starting with 1865c132d3fb31fe70531d6c03bf6150c6ae1febaff26ecc903c75d3884e978f not found: ID does not exist" Dec 09 15:06:16 crc kubenswrapper[4555]: I1209 15:06:16.437470 4555 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-c7f86c659-l8nlg"] Dec 09 15:06:18 crc kubenswrapper[4555]: I1209 15:06:18.165890 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eef1d965-488b-465b-bc5b-ce417339682b" path="/var/lib/kubelet/pods/eef1d965-488b-465b-bc5b-ce417339682b/volumes" Dec 09 15:06:55 crc kubenswrapper[4555]: I1209 15:06:55.531286 4555 patch_prober.go:28] interesting pod/machine-config-daemon-x6p6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 15:06:55 crc kubenswrapper[4555]: I1209 15:06:55.531669 4555 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 15:07:25 crc kubenswrapper[4555]: I1209 15:07:25.531124 4555 patch_prober.go:28] interesting pod/machine-config-daemon-x6p6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 15:07:25 crc kubenswrapper[4555]: I1209 15:07:25.531455 4555 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" Dec 09 15:07:55 crc kubenswrapper[4555]: I1209 15:07:55.531445 4555 patch_prober.go:28] interesting pod/machine-config-daemon-x6p6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 15:07:55 crc kubenswrapper[4555]: I1209 15:07:55.531770 4555 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 15:07:55 crc kubenswrapper[4555]: I1209 15:07:55.531804 4555 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" Dec 09 15:07:55 crc kubenswrapper[4555]: I1209 15:07:55.532165 4555 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8dd058b28813c68e71df78106624226378245df5fe593a53aa3d62ba3999ec86"} pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 15:07:55 crc kubenswrapper[4555]: I1209 15:07:55.532211 4555 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" containerName="machine-config-daemon" containerID="cri-o://8dd058b28813c68e71df78106624226378245df5fe593a53aa3d62ba3999ec86" gracePeriod=600 Dec 09 15:07:55 crc kubenswrapper[4555]: I1209 15:07:55.826731 4555 generic.go:334] "Generic (PLEG): container finished" podID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" containerID="8dd058b28813c68e71df78106624226378245df5fe593a53aa3d62ba3999ec86" exitCode=0 Dec 09 15:07:55 crc kubenswrapper[4555]: I1209 15:07:55.826783 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" event={"ID":"b929ab12-7d60-4e9f-a1a7-3e2d8b086d36","Type":"ContainerDied","Data":"8dd058b28813c68e71df78106624226378245df5fe593a53aa3d62ba3999ec86"} Dec 09 15:07:55 crc kubenswrapper[4555]: I1209 15:07:55.826959 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" event={"ID":"b929ab12-7d60-4e9f-a1a7-3e2d8b086d36","Type":"ContainerStarted","Data":"1349c5c8b355954a96a1c092ad88f43ece6bf94e84e0fb6591265f020c3fe076"} Dec 09 15:07:55 crc kubenswrapper[4555]: I1209 15:07:55.826981 4555 scope.go:117] "RemoveContainer" containerID="180dd851cd6ac7ce99763c65c8a9c37630bce1147ad5777bcc34e7fd32e66125" Dec 09 15:09:29 crc kubenswrapper[4555]: I1209 15:09:29.936885 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kqr9n"] Dec 09 15:09:29 crc kubenswrapper[4555]: E1209 15:09:29.937422 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eef1d965-488b-465b-bc5b-ce417339682b" containerName="console" Dec 09 15:09:29 crc kubenswrapper[4555]: I1209 15:09:29.937435 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="eef1d965-488b-465b-bc5b-ce417339682b" containerName="console" Dec 09 15:09:29 crc kubenswrapper[4555]: I1209 
15:09:29.937526 4555 memory_manager.go:354] "RemoveStaleState removing state" podUID="eef1d965-488b-465b-bc5b-ce417339682b" containerName="console" Dec 09 15:09:29 crc kubenswrapper[4555]: I1209 15:09:29.938160 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kqr9n" Dec 09 15:09:29 crc kubenswrapper[4555]: I1209 15:09:29.939398 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 09 15:09:29 crc kubenswrapper[4555]: I1209 15:09:29.946580 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kqr9n"] Dec 09 15:09:29 crc kubenswrapper[4555]: I1209 15:09:29.977658 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7mpv\" (UniqueName: \"kubernetes.io/projected/cfc81043-e54b-4a52-ad3a-6e10344badc1-kube-api-access-k7mpv\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kqr9n\" (UID: \"cfc81043-e54b-4a52-ad3a-6e10344badc1\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kqr9n" Dec 09 15:09:29 crc kubenswrapper[4555]: I1209 15:09:29.977732 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cfc81043-e54b-4a52-ad3a-6e10344badc1-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kqr9n\" (UID: \"cfc81043-e54b-4a52-ad3a-6e10344badc1\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kqr9n" Dec 09 15:09:29 crc kubenswrapper[4555]: I1209 15:09:29.977843 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cfc81043-e54b-4a52-ad3a-6e10344badc1-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kqr9n\" (UID: \"cfc81043-e54b-4a52-ad3a-6e10344badc1\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kqr9n" Dec 09 15:09:30 crc kubenswrapper[4555]: I1209 15:09:30.079078 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7mpv\" (UniqueName: \"kubernetes.io/projected/cfc81043-e54b-4a52-ad3a-6e10344badc1-kube-api-access-k7mpv\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kqr9n\" (UID: \"cfc81043-e54b-4a52-ad3a-6e10344badc1\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kqr9n" Dec 09 15:09:30 crc kubenswrapper[4555]: I1209 15:09:30.079152 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cfc81043-e54b-4a52-ad3a-6e10344badc1-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kqr9n\" (UID: \"cfc81043-e54b-4a52-ad3a-6e10344badc1\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kqr9n" Dec 09 15:09:30 crc kubenswrapper[4555]: I1209 15:09:30.079205 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cfc81043-e54b-4a52-ad3a-6e10344badc1-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kqr9n\" (UID: \"cfc81043-e54b-4a52-ad3a-6e10344badc1\") " 
pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kqr9n" Dec 09 15:09:30 crc kubenswrapper[4555]: I1209 15:09:30.079608 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cfc81043-e54b-4a52-ad3a-6e10344badc1-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kqr9n\" (UID: \"cfc81043-e54b-4a52-ad3a-6e10344badc1\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kqr9n" Dec 09 15:09:30 crc kubenswrapper[4555]: I1209 15:09:30.079639 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cfc81043-e54b-4a52-ad3a-6e10344badc1-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kqr9n\" (UID: \"cfc81043-e54b-4a52-ad3a-6e10344badc1\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kqr9n" Dec 09 15:09:30 crc kubenswrapper[4555]: I1209 15:09:30.093812 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7mpv\" (UniqueName: \"kubernetes.io/projected/cfc81043-e54b-4a52-ad3a-6e10344badc1-kube-api-access-k7mpv\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kqr9n\" (UID: \"cfc81043-e54b-4a52-ad3a-6e10344badc1\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kqr9n" Dec 09 15:09:30 crc kubenswrapper[4555]: I1209 15:09:30.250461 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kqr9n" Dec 09 15:09:30 crc kubenswrapper[4555]: I1209 15:09:30.584947 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kqr9n"] Dec 09 15:09:31 crc kubenswrapper[4555]: I1209 15:09:31.275430 4555 generic.go:334] "Generic (PLEG): container finished" podID="cfc81043-e54b-4a52-ad3a-6e10344badc1" containerID="224573e97f6e8403be92b476e78bd6de50de8476e5cfe0d11a21cba946667721" exitCode=0 Dec 09 15:09:31 crc kubenswrapper[4555]: I1209 15:09:31.275618 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kqr9n" event={"ID":"cfc81043-e54b-4a52-ad3a-6e10344badc1","Type":"ContainerDied","Data":"224573e97f6e8403be92b476e78bd6de50de8476e5cfe0d11a21cba946667721"} Dec 09 15:09:31 crc kubenswrapper[4555]: I1209 15:09:31.275663 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kqr9n" event={"ID":"cfc81043-e54b-4a52-ad3a-6e10344badc1","Type":"ContainerStarted","Data":"83071921efdf3438b33c51bd90231ac555d74574312521522ed05969b5871560"} Dec 09 15:09:31 crc kubenswrapper[4555]: I1209 15:09:31.276773 4555 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 09 15:09:33 crc kubenswrapper[4555]: I1209 15:09:33.286576 4555 generic.go:334] "Generic (PLEG): container finished" podID="cfc81043-e54b-4a52-ad3a-6e10344badc1" containerID="f146247d5ca38c283edd3e848288df1aef732e27993c310874fd1140a262bb43" exitCode=0 Dec 09 15:09:33 crc kubenswrapper[4555]: I1209 15:09:33.286617 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kqr9n" 
event={"ID":"cfc81043-e54b-4a52-ad3a-6e10344badc1","Type":"ContainerDied","Data":"f146247d5ca38c283edd3e848288df1aef732e27993c310874fd1140a262bb43"} Dec 09 15:09:34 crc kubenswrapper[4555]: I1209 15:09:34.293371 4555 generic.go:334] "Generic (PLEG): container finished" podID="cfc81043-e54b-4a52-ad3a-6e10344badc1" containerID="b802358f4c33fc0bf38b12cff673b1b54ed3e631daa63d996c99d67ac34ee270" exitCode=0 Dec 09 15:09:34 crc kubenswrapper[4555]: I1209 15:09:34.293451 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kqr9n" event={"ID":"cfc81043-e54b-4a52-ad3a-6e10344badc1","Type":"ContainerDied","Data":"b802358f4c33fc0bf38b12cff673b1b54ed3e631daa63d996c99d67ac34ee270"} Dec 09 15:09:35 crc kubenswrapper[4555]: I1209 15:09:35.463161 4555 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kqr9n" Dec 09 15:09:35 crc kubenswrapper[4555]: I1209 15:09:35.550557 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cfc81043-e54b-4a52-ad3a-6e10344badc1-util\") pod \"cfc81043-e54b-4a52-ad3a-6e10344badc1\" (UID: \"cfc81043-e54b-4a52-ad3a-6e10344badc1\") " Dec 09 15:09:35 crc kubenswrapper[4555]: I1209 15:09:35.550611 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7mpv\" (UniqueName: \"kubernetes.io/projected/cfc81043-e54b-4a52-ad3a-6e10344badc1-kube-api-access-k7mpv\") pod \"cfc81043-e54b-4a52-ad3a-6e10344badc1\" (UID: \"cfc81043-e54b-4a52-ad3a-6e10344badc1\") " Dec 09 15:09:35 crc kubenswrapper[4555]: I1209 15:09:35.550663 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cfc81043-e54b-4a52-ad3a-6e10344badc1-bundle\") pod \"cfc81043-e54b-4a52-ad3a-6e10344badc1\" (UID: \"cfc81043-e54b-4a52-ad3a-6e10344badc1\") " Dec 09 15:09:35 crc kubenswrapper[4555]: I1209 15:09:35.552430 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfc81043-e54b-4a52-ad3a-6e10344badc1-bundle" (OuterVolumeSpecName: "bundle") pod "cfc81043-e54b-4a52-ad3a-6e10344badc1" (UID: "cfc81043-e54b-4a52-ad3a-6e10344badc1"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:09:35 crc kubenswrapper[4555]: I1209 15:09:35.555009 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfc81043-e54b-4a52-ad3a-6e10344badc1-kube-api-access-k7mpv" (OuterVolumeSpecName: "kube-api-access-k7mpv") pod "cfc81043-e54b-4a52-ad3a-6e10344badc1" (UID: "cfc81043-e54b-4a52-ad3a-6e10344badc1"). InnerVolumeSpecName "kube-api-access-k7mpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:09:35 crc kubenswrapper[4555]: I1209 15:09:35.560397 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfc81043-e54b-4a52-ad3a-6e10344badc1-util" (OuterVolumeSpecName: "util") pod "cfc81043-e54b-4a52-ad3a-6e10344badc1" (UID: "cfc81043-e54b-4a52-ad3a-6e10344badc1"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:09:35 crc kubenswrapper[4555]: I1209 15:09:35.652151 4555 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cfc81043-e54b-4a52-ad3a-6e10344badc1-util\") on node \"crc\" DevicePath \"\"" Dec 09 15:09:35 crc kubenswrapper[4555]: I1209 15:09:35.652292 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7mpv\" (UniqueName: \"kubernetes.io/projected/cfc81043-e54b-4a52-ad3a-6e10344badc1-kube-api-access-k7mpv\") on node \"crc\" DevicePath \"\"" Dec 09 15:09:35 crc kubenswrapper[4555]: I1209 15:09:35.652323 4555 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cfc81043-e54b-4a52-ad3a-6e10344badc1-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 15:09:36 crc kubenswrapper[4555]: I1209 15:09:36.303631 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kqr9n" event={"ID":"cfc81043-e54b-4a52-ad3a-6e10344badc1","Type":"ContainerDied","Data":"83071921efdf3438b33c51bd90231ac555d74574312521522ed05969b5871560"} Dec 09 15:09:36 crc kubenswrapper[4555]: I1209 15:09:36.303671 4555 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kqr9n" Dec 09 15:09:36 crc kubenswrapper[4555]: I1209 15:09:36.303679 4555 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83071921efdf3438b33c51bd90231ac555d74574312521522ed05969b5871560" Dec 09 15:09:43 crc kubenswrapper[4555]: I1209 15:09:43.875075 4555 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-p4bl4"] Dec 09 15:09:43 crc kubenswrapper[4555]: I1209 15:09:43.875689 4555 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" podUID="c87d128c-39da-4880-9efc-7d2e6319a854" containerName="ovn-controller" containerID="cri-o://64685d2778ef0cf9d316897042a9e9353603a0cc7a234baf821ec03b149aeb07" gracePeriod=30 Dec 09 15:09:43 crc kubenswrapper[4555]: I1209 15:09:43.875980 4555 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" podUID="c87d128c-39da-4880-9efc-7d2e6319a854" containerName="sbdb" containerID="cri-o://d9dc3533604ee31d652a75a8c65146c35b320c41f2d4fe85c3c03cadb99a9d46" gracePeriod=30 Dec 09 15:09:43 crc kubenswrapper[4555]: I1209 15:09:43.876018 4555 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" podUID="c87d128c-39da-4880-9efc-7d2e6319a854" containerName="nbdb" containerID="cri-o://a05bba24ff7f1d956fa0d3cb5949de61a278cddb4674bdb723749a2b005d82d4" gracePeriod=30 Dec 09 15:09:43 crc kubenswrapper[4555]: I1209 15:09:43.876048 4555 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" podUID="c87d128c-39da-4880-9efc-7d2e6319a854" containerName="northd" containerID="cri-o://ec3d3a350dce6c9b5ea637d00abdaf43796a5595105dc82133f3d84349c12c25" gracePeriod=30 Dec 09 15:09:43 crc kubenswrapper[4555]: I1209 15:09:43.876079 4555 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" podUID="c87d128c-39da-4880-9efc-7d2e6319a854" containerName="kube-rbac-proxy-ovn-metrics" 
containerID="cri-o://cd6cd4a3aeb4f2f1ad5ad6b22f80134a393ccb4c5b7e22c33d97324c8f70800d" gracePeriod=30 Dec 09 15:09:43 crc kubenswrapper[4555]: I1209 15:09:43.876106 4555 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" podUID="c87d128c-39da-4880-9efc-7d2e6319a854" containerName="kube-rbac-proxy-node" containerID="cri-o://4dee71204e7a71c311143a6ed6fbde69968bc877053d6a400d252eaa010a88ac" gracePeriod=30 Dec 09 15:09:43 crc kubenswrapper[4555]: I1209 15:09:43.876135 4555 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" podUID="c87d128c-39da-4880-9efc-7d2e6319a854" containerName="ovn-acl-logging" containerID="cri-o://932166905ddf0fc44b9b7356a8f5efe68e1d44307d8e486946c8285ae969b8e5" gracePeriod=30 Dec 09 15:09:43 crc kubenswrapper[4555]: I1209 15:09:43.926444 4555 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" podUID="c87d128c-39da-4880-9efc-7d2e6319a854" containerName="ovnkube-controller" containerID="cri-o://57d2adc84a2cd1f31fcedce27250e4e7255a9cc27f0d8422591d743dfb996466" gracePeriod=30 Dec 09 15:09:44 crc kubenswrapper[4555]: I1209 15:09:44.337064 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p4bl4_c87d128c-39da-4880-9efc-7d2e6319a854/ovnkube-controller/3.log" Dec 09 15:09:44 crc kubenswrapper[4555]: I1209 15:09:44.339023 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p4bl4_c87d128c-39da-4880-9efc-7d2e6319a854/ovn-acl-logging/0.log" Dec 09 15:09:44 crc kubenswrapper[4555]: I1209 15:09:44.339352 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p4bl4_c87d128c-39da-4880-9efc-7d2e6319a854/ovn-controller/0.log" Dec 09 15:09:44 crc kubenswrapper[4555]: I1209 15:09:44.339640 4555 generic.go:334] "Generic (PLEG): container finished" podID="c87d128c-39da-4880-9efc-7d2e6319a854" containerID="57d2adc84a2cd1f31fcedce27250e4e7255a9cc27f0d8422591d743dfb996466" exitCode=0 Dec 09 15:09:44 crc kubenswrapper[4555]: I1209 15:09:44.339661 4555 generic.go:334] "Generic (PLEG): container finished" podID="c87d128c-39da-4880-9efc-7d2e6319a854" containerID="d9dc3533604ee31d652a75a8c65146c35b320c41f2d4fe85c3c03cadb99a9d46" exitCode=0 Dec 09 15:09:44 crc kubenswrapper[4555]: I1209 15:09:44.339670 4555 generic.go:334] "Generic (PLEG): container finished" podID="c87d128c-39da-4880-9efc-7d2e6319a854" containerID="a05bba24ff7f1d956fa0d3cb5949de61a278cddb4674bdb723749a2b005d82d4" exitCode=0 Dec 09 15:09:44 crc kubenswrapper[4555]: I1209 15:09:44.339676 4555 generic.go:334] "Generic (PLEG): container finished" podID="c87d128c-39da-4880-9efc-7d2e6319a854" containerID="ec3d3a350dce6c9b5ea637d00abdaf43796a5595105dc82133f3d84349c12c25" exitCode=0 Dec 09 15:09:44 crc kubenswrapper[4555]: I1209 15:09:44.339681 4555 generic.go:334] "Generic (PLEG): container finished" podID="c87d128c-39da-4880-9efc-7d2e6319a854" containerID="932166905ddf0fc44b9b7356a8f5efe68e1d44307d8e486946c8285ae969b8e5" exitCode=143 Dec 09 15:09:44 crc kubenswrapper[4555]: I1209 15:09:44.339687 4555 generic.go:334] "Generic (PLEG): container finished" podID="c87d128c-39da-4880-9efc-7d2e6319a854" containerID="64685d2778ef0cf9d316897042a9e9353603a0cc7a234baf821ec03b149aeb07" exitCode=143 Dec 09 15:09:44 crc kubenswrapper[4555]: I1209 15:09:44.339728 4555 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" event={"ID":"c87d128c-39da-4880-9efc-7d2e6319a854","Type":"ContainerDied","Data":"57d2adc84a2cd1f31fcedce27250e4e7255a9cc27f0d8422591d743dfb996466"} Dec 09 15:09:44 crc kubenswrapper[4555]: I1209 15:09:44.339749 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" event={"ID":"c87d128c-39da-4880-9efc-7d2e6319a854","Type":"ContainerDied","Data":"d9dc3533604ee31d652a75a8c65146c35b320c41f2d4fe85c3c03cadb99a9d46"} Dec 09 15:09:44 crc kubenswrapper[4555]: I1209 15:09:44.339759 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" event={"ID":"c87d128c-39da-4880-9efc-7d2e6319a854","Type":"ContainerDied","Data":"a05bba24ff7f1d956fa0d3cb5949de61a278cddb4674bdb723749a2b005d82d4"} Dec 09 15:09:44 crc kubenswrapper[4555]: I1209 15:09:44.339767 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" event={"ID":"c87d128c-39da-4880-9efc-7d2e6319a854","Type":"ContainerDied","Data":"ec3d3a350dce6c9b5ea637d00abdaf43796a5595105dc82133f3d84349c12c25"} Dec 09 15:09:44 crc kubenswrapper[4555]: I1209 15:09:44.339776 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" event={"ID":"c87d128c-39da-4880-9efc-7d2e6319a854","Type":"ContainerDied","Data":"932166905ddf0fc44b9b7356a8f5efe68e1d44307d8e486946c8285ae969b8e5"} Dec 09 15:09:44 crc kubenswrapper[4555]: I1209 15:09:44.339784 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" event={"ID":"c87d128c-39da-4880-9efc-7d2e6319a854","Type":"ContainerDied","Data":"64685d2778ef0cf9d316897042a9e9353603a0cc7a234baf821ec03b149aeb07"} Dec 09 15:09:44 crc kubenswrapper[4555]: I1209 15:09:44.339797 4555 scope.go:117] "RemoveContainer" containerID="f2496bb1d7c5b48f9944b2026978276ba9cf676ecd42dff619698a1ab8776269" Dec 09 15:09:44 crc kubenswrapper[4555]: I1209 15:09:44.341616 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gjjcm_3e34542c-29e8-43d3-add3-42aa080e0c8a/kube-multus/2.log" Dec 09 15:09:44 crc kubenswrapper[4555]: I1209 15:09:44.341952 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gjjcm_3e34542c-29e8-43d3-add3-42aa080e0c8a/kube-multus/1.log" Dec 09 15:09:44 crc kubenswrapper[4555]: I1209 15:09:44.341978 4555 generic.go:334] "Generic (PLEG): container finished" podID="3e34542c-29e8-43d3-add3-42aa080e0c8a" containerID="fddfb072bd733f5c700d405bf80dcbbe045961e870311c7880dc5d639fdb0059" exitCode=2 Dec 09 15:09:44 crc kubenswrapper[4555]: I1209 15:09:44.341995 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gjjcm" event={"ID":"3e34542c-29e8-43d3-add3-42aa080e0c8a","Type":"ContainerDied","Data":"fddfb072bd733f5c700d405bf80dcbbe045961e870311c7880dc5d639fdb0059"} Dec 09 15:09:44 crc kubenswrapper[4555]: I1209 15:09:44.342370 4555 scope.go:117] "RemoveContainer" containerID="fddfb072bd733f5c700d405bf80dcbbe045961e870311c7880dc5d639fdb0059" Dec 09 15:09:44 crc kubenswrapper[4555]: E1209 15:09:44.342543 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-gjjcm_openshift-multus(3e34542c-29e8-43d3-add3-42aa080e0c8a)\"" pod="openshift-multus/multus-gjjcm" podUID="3e34542c-29e8-43d3-add3-42aa080e0c8a" Dec 
09 15:09:44 crc kubenswrapper[4555]: I1209 15:09:44.355745 4555 scope.go:117] "RemoveContainer" containerID="73a16b54312a5c81ee3d5b4753cd53c9a01cc7f174b340b1f0db44b019541fbd" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.107136 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p4bl4_c87d128c-39da-4880-9efc-7d2e6319a854/ovn-acl-logging/0.log" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.107636 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p4bl4_c87d128c-39da-4880-9efc-7d2e6319a854/ovn-controller/0.log" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.107924 4555 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.144505 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4zlc7"] Dec 09 15:09:45 crc kubenswrapper[4555]: E1209 15:09:45.144681 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c87d128c-39da-4880-9efc-7d2e6319a854" containerName="kubecfg-setup" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.144708 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="c87d128c-39da-4880-9efc-7d2e6319a854" containerName="kubecfg-setup" Dec 09 15:09:45 crc kubenswrapper[4555]: E1209 15:09:45.144722 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c87d128c-39da-4880-9efc-7d2e6319a854" containerName="ovn-controller" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.144746 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="c87d128c-39da-4880-9efc-7d2e6319a854" containerName="ovn-controller" Dec 09 15:09:45 crc kubenswrapper[4555]: E1209 15:09:45.144754 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c87d128c-39da-4880-9efc-7d2e6319a854" containerName="nbdb" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.144759 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="c87d128c-39da-4880-9efc-7d2e6319a854" containerName="nbdb" Dec 09 15:09:45 crc kubenswrapper[4555]: E1209 15:09:45.144767 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfc81043-e54b-4a52-ad3a-6e10344badc1" containerName="pull" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.144773 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfc81043-e54b-4a52-ad3a-6e10344badc1" containerName="pull" Dec 09 15:09:45 crc kubenswrapper[4555]: E1209 15:09:45.144782 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c87d128c-39da-4880-9efc-7d2e6319a854" containerName="ovn-acl-logging" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.144788 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="c87d128c-39da-4880-9efc-7d2e6319a854" containerName="ovn-acl-logging" Dec 09 15:09:45 crc kubenswrapper[4555]: E1209 15:09:45.144797 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c87d128c-39da-4880-9efc-7d2e6319a854" containerName="ovnkube-controller" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.144802 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="c87d128c-39da-4880-9efc-7d2e6319a854" containerName="ovnkube-controller" Dec 09 15:09:45 crc kubenswrapper[4555]: E1209 15:09:45.144809 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfc81043-e54b-4a52-ad3a-6e10344badc1" containerName="extract" Dec 09 15:09:45 crc 
kubenswrapper[4555]: I1209 15:09:45.144815 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfc81043-e54b-4a52-ad3a-6e10344badc1" containerName="extract" Dec 09 15:09:45 crc kubenswrapper[4555]: E1209 15:09:45.144822 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c87d128c-39da-4880-9efc-7d2e6319a854" containerName="sbdb" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.144827 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="c87d128c-39da-4880-9efc-7d2e6319a854" containerName="sbdb" Dec 09 15:09:45 crc kubenswrapper[4555]: E1209 15:09:45.144834 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c87d128c-39da-4880-9efc-7d2e6319a854" containerName="ovnkube-controller" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.144839 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="c87d128c-39da-4880-9efc-7d2e6319a854" containerName="ovnkube-controller" Dec 09 15:09:45 crc kubenswrapper[4555]: E1209 15:09:45.144844 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfc81043-e54b-4a52-ad3a-6e10344badc1" containerName="util" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.144849 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfc81043-e54b-4a52-ad3a-6e10344badc1" containerName="util" Dec 09 15:09:45 crc kubenswrapper[4555]: E1209 15:09:45.144854 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c87d128c-39da-4880-9efc-7d2e6319a854" containerName="ovnkube-controller" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.144859 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="c87d128c-39da-4880-9efc-7d2e6319a854" containerName="ovnkube-controller" Dec 09 15:09:45 crc kubenswrapper[4555]: E1209 15:09:45.144866 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c87d128c-39da-4880-9efc-7d2e6319a854" containerName="ovnkube-controller" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.144871 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="c87d128c-39da-4880-9efc-7d2e6319a854" containerName="ovnkube-controller" Dec 09 15:09:45 crc kubenswrapper[4555]: E1209 15:09:45.144880 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c87d128c-39da-4880-9efc-7d2e6319a854" containerName="kube-rbac-proxy-ovn-metrics" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.144885 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="c87d128c-39da-4880-9efc-7d2e6319a854" containerName="kube-rbac-proxy-ovn-metrics" Dec 09 15:09:45 crc kubenswrapper[4555]: E1209 15:09:45.144893 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c87d128c-39da-4880-9efc-7d2e6319a854" containerName="northd" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.144898 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="c87d128c-39da-4880-9efc-7d2e6319a854" containerName="northd" Dec 09 15:09:45 crc kubenswrapper[4555]: E1209 15:09:45.144906 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c87d128c-39da-4880-9efc-7d2e6319a854" containerName="kube-rbac-proxy-node" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.144911 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="c87d128c-39da-4880-9efc-7d2e6319a854" containerName="kube-rbac-proxy-node" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.145000 4555 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfc81043-e54b-4a52-ad3a-6e10344badc1" containerName="extract" Dec 09 15:09:45 crc 
kubenswrapper[4555]: I1209 15:09:45.145010 4555 memory_manager.go:354] "RemoveStaleState removing state" podUID="c87d128c-39da-4880-9efc-7d2e6319a854" containerName="sbdb" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.145017 4555 memory_manager.go:354] "RemoveStaleState removing state" podUID="c87d128c-39da-4880-9efc-7d2e6319a854" containerName="ovnkube-controller" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.145023 4555 memory_manager.go:354] "RemoveStaleState removing state" podUID="c87d128c-39da-4880-9efc-7d2e6319a854" containerName="kube-rbac-proxy-node" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.145029 4555 memory_manager.go:354] "RemoveStaleState removing state" podUID="c87d128c-39da-4880-9efc-7d2e6319a854" containerName="nbdb" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.145036 4555 memory_manager.go:354] "RemoveStaleState removing state" podUID="c87d128c-39da-4880-9efc-7d2e6319a854" containerName="ovn-controller" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.145043 4555 memory_manager.go:354] "RemoveStaleState removing state" podUID="c87d128c-39da-4880-9efc-7d2e6319a854" containerName="ovn-acl-logging" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.145049 4555 memory_manager.go:354] "RemoveStaleState removing state" podUID="c87d128c-39da-4880-9efc-7d2e6319a854" containerName="kube-rbac-proxy-ovn-metrics" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.145057 4555 memory_manager.go:354] "RemoveStaleState removing state" podUID="c87d128c-39da-4880-9efc-7d2e6319a854" containerName="northd" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.145063 4555 memory_manager.go:354] "RemoveStaleState removing state" podUID="c87d128c-39da-4880-9efc-7d2e6319a854" containerName="ovnkube-controller" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.145074 4555 memory_manager.go:354] "RemoveStaleState removing state" podUID="c87d128c-39da-4880-9efc-7d2e6319a854" containerName="ovnkube-controller" Dec 09 15:09:45 crc kubenswrapper[4555]: E1209 15:09:45.145153 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c87d128c-39da-4880-9efc-7d2e6319a854" containerName="ovnkube-controller" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.145159 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="c87d128c-39da-4880-9efc-7d2e6319a854" containerName="ovnkube-controller" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.145234 4555 memory_manager.go:354] "RemoveStaleState removing state" podUID="c87d128c-39da-4880-9efc-7d2e6319a854" containerName="ovnkube-controller" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.145409 4555 memory_manager.go:354] "RemoveStaleState removing state" podUID="c87d128c-39da-4880-9efc-7d2e6319a854" containerName="ovnkube-controller" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.146675 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.156103 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c87d128c-39da-4880-9efc-7d2e6319a854-ovn-node-metrics-cert\") pod \"c87d128c-39da-4880-9efc-7d2e6319a854\" (UID: \"c87d128c-39da-4880-9efc-7d2e6319a854\") " Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.156240 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "c87d128c-39da-4880-9efc-7d2e6319a854" (UID: "c87d128c-39da-4880-9efc-7d2e6319a854"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.156787 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-host-run-netns\") pod \"c87d128c-39da-4880-9efc-7d2e6319a854\" (UID: \"c87d128c-39da-4880-9efc-7d2e6319a854\") " Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.156821 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-etc-openvswitch\") pod \"c87d128c-39da-4880-9efc-7d2e6319a854\" (UID: \"c87d128c-39da-4880-9efc-7d2e6319a854\") " Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.156855 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c87d128c-39da-4880-9efc-7d2e6319a854-env-overrides\") pod \"c87d128c-39da-4880-9efc-7d2e6319a854\" (UID: \"c87d128c-39da-4880-9efc-7d2e6319a854\") " Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.156930 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "c87d128c-39da-4880-9efc-7d2e6319a854" (UID: "c87d128c-39da-4880-9efc-7d2e6319a854"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.156956 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-run-systemd\") pod \"c87d128c-39da-4880-9efc-7d2e6319a854\" (UID: \"c87d128c-39da-4880-9efc-7d2e6319a854\") " Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.156980 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-run-openvswitch\") pod \"c87d128c-39da-4880-9efc-7d2e6319a854\" (UID: \"c87d128c-39da-4880-9efc-7d2e6319a854\") " Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.157004 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c87d128c-39da-4880-9efc-7d2e6319a854-ovnkube-script-lib\") pod \"c87d128c-39da-4880-9efc-7d2e6319a854\" (UID: \"c87d128c-39da-4880-9efc-7d2e6319a854\") " Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.157025 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c87d128c-39da-4880-9efc-7d2e6319a854-ovnkube-config\") pod \"c87d128c-39da-4880-9efc-7d2e6319a854\" (UID: \"c87d128c-39da-4880-9efc-7d2e6319a854\") " Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.157043 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-var-lib-openvswitch\") pod \"c87d128c-39da-4880-9efc-7d2e6319a854\" (UID: \"c87d128c-39da-4880-9efc-7d2e6319a854\") " Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.157056 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-run-ovn\") pod \"c87d128c-39da-4880-9efc-7d2e6319a854\" (UID: \"c87d128c-39da-4880-9efc-7d2e6319a854\") " Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.157073 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-node-log\") pod \"c87d128c-39da-4880-9efc-7d2e6319a854\" (UID: \"c87d128c-39da-4880-9efc-7d2e6319a854\") " Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.157096 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-host-cni-netd\") pod \"c87d128c-39da-4880-9efc-7d2e6319a854\" (UID: \"c87d128c-39da-4880-9efc-7d2e6319a854\") " Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.157126 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-systemd-units\") pod \"c87d128c-39da-4880-9efc-7d2e6319a854\" (UID: \"c87d128c-39da-4880-9efc-7d2e6319a854\") " Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.157139 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-log-socket\") pod \"c87d128c-39da-4880-9efc-7d2e6319a854\" (UID: 
\"c87d128c-39da-4880-9efc-7d2e6319a854\") " Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.157172 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7p8n9\" (UniqueName: \"kubernetes.io/projected/c87d128c-39da-4880-9efc-7d2e6319a854-kube-api-access-7p8n9\") pod \"c87d128c-39da-4880-9efc-7d2e6319a854\" (UID: \"c87d128c-39da-4880-9efc-7d2e6319a854\") " Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.157186 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-host-cni-bin\") pod \"c87d128c-39da-4880-9efc-7d2e6319a854\" (UID: \"c87d128c-39da-4880-9efc-7d2e6319a854\") " Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.157205 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-host-kubelet\") pod \"c87d128c-39da-4880-9efc-7d2e6319a854\" (UID: \"c87d128c-39da-4880-9efc-7d2e6319a854\") " Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.157216 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-host-slash\") pod \"c87d128c-39da-4880-9efc-7d2e6319a854\" (UID: \"c87d128c-39da-4880-9efc-7d2e6319a854\") " Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.157215 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c87d128c-39da-4880-9efc-7d2e6319a854-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "c87d128c-39da-4880-9efc-7d2e6319a854" (UID: "c87d128c-39da-4880-9efc-7d2e6319a854"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.157235 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-host-var-lib-cni-networks-ovn-kubernetes\") pod \"c87d128c-39da-4880-9efc-7d2e6319a854\" (UID: \"c87d128c-39da-4880-9efc-7d2e6319a854\") " Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.157246 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-node-log" (OuterVolumeSpecName: "node-log") pod "c87d128c-39da-4880-9efc-7d2e6319a854" (UID: "c87d128c-39da-4880-9efc-7d2e6319a854"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.157250 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-host-run-ovn-kubernetes\") pod \"c87d128c-39da-4880-9efc-7d2e6319a854\" (UID: \"c87d128c-39da-4880-9efc-7d2e6319a854\") " Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.157340 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "c87d128c-39da-4880-9efc-7d2e6319a854" (UID: "c87d128c-39da-4880-9efc-7d2e6319a854"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.157371 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "c87d128c-39da-4880-9efc-7d2e6319a854" (UID: "c87d128c-39da-4880-9efc-7d2e6319a854"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.157398 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "c87d128c-39da-4880-9efc-7d2e6319a854" (UID: "c87d128c-39da-4880-9efc-7d2e6319a854"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.157417 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "c87d128c-39da-4880-9efc-7d2e6319a854" (UID: "c87d128c-39da-4880-9efc-7d2e6319a854"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.157431 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "c87d128c-39da-4880-9efc-7d2e6319a854" (UID: "c87d128c-39da-4880-9efc-7d2e6319a854"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.157448 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-log-socket" (OuterVolumeSpecName: "log-socket") pod "c87d128c-39da-4880-9efc-7d2e6319a854" (UID: "c87d128c-39da-4880-9efc-7d2e6319a854"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.157419 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "c87d128c-39da-4880-9efc-7d2e6319a854" (UID: "c87d128c-39da-4880-9efc-7d2e6319a854"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.157466 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "c87d128c-39da-4880-9efc-7d2e6319a854" (UID: "c87d128c-39da-4880-9efc-7d2e6319a854"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.157496 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "c87d128c-39da-4880-9efc-7d2e6319a854" (UID: "c87d128c-39da-4880-9efc-7d2e6319a854"). 
InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.157503 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-host-slash" (OuterVolumeSpecName: "host-slash") pod "c87d128c-39da-4880-9efc-7d2e6319a854" (UID: "c87d128c-39da-4880-9efc-7d2e6319a854"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.157566 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c87d128c-39da-4880-9efc-7d2e6319a854-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "c87d128c-39da-4880-9efc-7d2e6319a854" (UID: "c87d128c-39da-4880-9efc-7d2e6319a854"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.157754 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c87d128c-39da-4880-9efc-7d2e6319a854-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "c87d128c-39da-4880-9efc-7d2e6319a854" (UID: "c87d128c-39da-4880-9efc-7d2e6319a854"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.157756 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "c87d128c-39da-4880-9efc-7d2e6319a854" (UID: "c87d128c-39da-4880-9efc-7d2e6319a854"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.157889 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4a47d5ee-c928-4a04-8683-8fd4fe63ea9c-host-run-netns\") pod \"ovnkube-node-4zlc7\" (UID: \"4a47d5ee-c928-4a04-8683-8fd4fe63ea9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.157921 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4a47d5ee-c928-4a04-8683-8fd4fe63ea9c-host-kubelet\") pod \"ovnkube-node-4zlc7\" (UID: \"4a47d5ee-c928-4a04-8683-8fd4fe63ea9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.157949 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4a47d5ee-c928-4a04-8683-8fd4fe63ea9c-env-overrides\") pod \"ovnkube-node-4zlc7\" (UID: \"4a47d5ee-c928-4a04-8683-8fd4fe63ea9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.157966 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4a47d5ee-c928-4a04-8683-8fd4fe63ea9c-host-cni-netd\") pod \"ovnkube-node-4zlc7\" (UID: \"4a47d5ee-c928-4a04-8683-8fd4fe63ea9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.157986 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62mwj\" (UniqueName: \"kubernetes.io/projected/4a47d5ee-c928-4a04-8683-8fd4fe63ea9c-kube-api-access-62mwj\") pod \"ovnkube-node-4zlc7\" (UID: \"4a47d5ee-c928-4a04-8683-8fd4fe63ea9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.158013 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4a47d5ee-c928-4a04-8683-8fd4fe63ea9c-host-cni-bin\") pod \"ovnkube-node-4zlc7\" (UID: \"4a47d5ee-c928-4a04-8683-8fd4fe63ea9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.158031 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4a47d5ee-c928-4a04-8683-8fd4fe63ea9c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4zlc7\" (UID: \"4a47d5ee-c928-4a04-8683-8fd4fe63ea9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.158078 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4a47d5ee-c928-4a04-8683-8fd4fe63ea9c-log-socket\") pod \"ovnkube-node-4zlc7\" (UID: \"4a47d5ee-c928-4a04-8683-8fd4fe63ea9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.158095 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/4a47d5ee-c928-4a04-8683-8fd4fe63ea9c-ovnkube-script-lib\") pod \"ovnkube-node-4zlc7\" (UID: \"4a47d5ee-c928-4a04-8683-8fd4fe63ea9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.158107 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4a47d5ee-c928-4a04-8683-8fd4fe63ea9c-run-systemd\") pod \"ovnkube-node-4zlc7\" (UID: \"4a47d5ee-c928-4a04-8683-8fd4fe63ea9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.158131 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4a47d5ee-c928-4a04-8683-8fd4fe63ea9c-var-lib-openvswitch\") pod \"ovnkube-node-4zlc7\" (UID: \"4a47d5ee-c928-4a04-8683-8fd4fe63ea9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.158145 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4a47d5ee-c928-4a04-8683-8fd4fe63ea9c-etc-openvswitch\") pod \"ovnkube-node-4zlc7\" (UID: \"4a47d5ee-c928-4a04-8683-8fd4fe63ea9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.158159 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4a47d5ee-c928-4a04-8683-8fd4fe63ea9c-host-run-ovn-kubernetes\") pod \"ovnkube-node-4zlc7\" (UID: \"4a47d5ee-c928-4a04-8683-8fd4fe63ea9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.158175 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4a47d5ee-c928-4a04-8683-8fd4fe63ea9c-ovnkube-config\") pod \"ovnkube-node-4zlc7\" (UID: \"4a47d5ee-c928-4a04-8683-8fd4fe63ea9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.158188 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4a47d5ee-c928-4a04-8683-8fd4fe63ea9c-run-ovn\") pod \"ovnkube-node-4zlc7\" (UID: \"4a47d5ee-c928-4a04-8683-8fd4fe63ea9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.158203 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4a47d5ee-c928-4a04-8683-8fd4fe63ea9c-ovn-node-metrics-cert\") pod \"ovnkube-node-4zlc7\" (UID: \"4a47d5ee-c928-4a04-8683-8fd4fe63ea9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.158222 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4a47d5ee-c928-4a04-8683-8fd4fe63ea9c-systemd-units\") pod \"ovnkube-node-4zlc7\" (UID: \"4a47d5ee-c928-4a04-8683-8fd4fe63ea9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.158236 4555 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4a47d5ee-c928-4a04-8683-8fd4fe63ea9c-node-log\") pod \"ovnkube-node-4zlc7\" (UID: \"4a47d5ee-c928-4a04-8683-8fd4fe63ea9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.158272 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4a47d5ee-c928-4a04-8683-8fd4fe63ea9c-run-openvswitch\") pod \"ovnkube-node-4zlc7\" (UID: \"4a47d5ee-c928-4a04-8683-8fd4fe63ea9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.158308 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4a47d5ee-c928-4a04-8683-8fd4fe63ea9c-host-slash\") pod \"ovnkube-node-4zlc7\" (UID: \"4a47d5ee-c928-4a04-8683-8fd4fe63ea9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.158358 4555 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-systemd-units\") on node \"crc\" DevicePath \"\"" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.158368 4555 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-log-socket\") on node \"crc\" DevicePath \"\"" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.158375 4555 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-host-cni-bin\") on node \"crc\" DevicePath \"\"" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.158382 4555 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-host-kubelet\") on node \"crc\" DevicePath \"\"" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.158389 4555 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-host-slash\") on node \"crc\" DevicePath \"\"" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.158397 4555 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.158406 4555 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.158413 4555 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-host-run-netns\") on node \"crc\" DevicePath \"\"" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.158421 4555 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-etc-openvswitch\") on node \"crc\" DevicePath \"\"" 
Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.158428 4555 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c87d128c-39da-4880-9efc-7d2e6319a854-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.158436 4555 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-run-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.158446 4555 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c87d128c-39da-4880-9efc-7d2e6319a854-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.158455 4555 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c87d128c-39da-4880-9efc-7d2e6319a854-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.158462 4555 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.158469 4555 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.158557 4555 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-node-log\") on node \"crc\" DevicePath \"\"" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.158564 4555 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-host-cni-netd\") on node \"crc\" DevicePath \"\"" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.172551 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c87d128c-39da-4880-9efc-7d2e6319a854-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "c87d128c-39da-4880-9efc-7d2e6319a854" (UID: "c87d128c-39da-4880-9efc-7d2e6319a854"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.177764 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c87d128c-39da-4880-9efc-7d2e6319a854-kube-api-access-7p8n9" (OuterVolumeSpecName: "kube-api-access-7p8n9") pod "c87d128c-39da-4880-9efc-7d2e6319a854" (UID: "c87d128c-39da-4880-9efc-7d2e6319a854"). InnerVolumeSpecName "kube-api-access-7p8n9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.177888 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "c87d128c-39da-4880-9efc-7d2e6319a854" (UID: "c87d128c-39da-4880-9efc-7d2e6319a854"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.259953 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4a47d5ee-c928-4a04-8683-8fd4fe63ea9c-ovn-node-metrics-cert\") pod \"ovnkube-node-4zlc7\" (UID: \"4a47d5ee-c928-4a04-8683-8fd4fe63ea9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.260001 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4a47d5ee-c928-4a04-8683-8fd4fe63ea9c-systemd-units\") pod \"ovnkube-node-4zlc7\" (UID: \"4a47d5ee-c928-4a04-8683-8fd4fe63ea9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.260018 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4a47d5ee-c928-4a04-8683-8fd4fe63ea9c-node-log\") pod \"ovnkube-node-4zlc7\" (UID: \"4a47d5ee-c928-4a04-8683-8fd4fe63ea9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.260076 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4a47d5ee-c928-4a04-8683-8fd4fe63ea9c-run-openvswitch\") pod \"ovnkube-node-4zlc7\" (UID: \"4a47d5ee-c928-4a04-8683-8fd4fe63ea9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.260094 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4a47d5ee-c928-4a04-8683-8fd4fe63ea9c-host-slash\") pod \"ovnkube-node-4zlc7\" (UID: \"4a47d5ee-c928-4a04-8683-8fd4fe63ea9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.260139 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4a47d5ee-c928-4a04-8683-8fd4fe63ea9c-host-run-netns\") pod \"ovnkube-node-4zlc7\" (UID: \"4a47d5ee-c928-4a04-8683-8fd4fe63ea9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.260164 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4a47d5ee-c928-4a04-8683-8fd4fe63ea9c-host-kubelet\") pod \"ovnkube-node-4zlc7\" (UID: \"4a47d5ee-c928-4a04-8683-8fd4fe63ea9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.260187 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4a47d5ee-c928-4a04-8683-8fd4fe63ea9c-host-cni-netd\") pod \"ovnkube-node-4zlc7\" (UID: \"4a47d5ee-c928-4a04-8683-8fd4fe63ea9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.260199 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4a47d5ee-c928-4a04-8683-8fd4fe63ea9c-env-overrides\") pod \"ovnkube-node-4zlc7\" (UID: \"4a47d5ee-c928-4a04-8683-8fd4fe63ea9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 
15:09:45.260225 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62mwj\" (UniqueName: \"kubernetes.io/projected/4a47d5ee-c928-4a04-8683-8fd4fe63ea9c-kube-api-access-62mwj\") pod \"ovnkube-node-4zlc7\" (UID: \"4a47d5ee-c928-4a04-8683-8fd4fe63ea9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.260250 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4a47d5ee-c928-4a04-8683-8fd4fe63ea9c-host-cni-bin\") pod \"ovnkube-node-4zlc7\" (UID: \"4a47d5ee-c928-4a04-8683-8fd4fe63ea9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.260279 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4a47d5ee-c928-4a04-8683-8fd4fe63ea9c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4zlc7\" (UID: \"4a47d5ee-c928-4a04-8683-8fd4fe63ea9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.260352 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4a47d5ee-c928-4a04-8683-8fd4fe63ea9c-log-socket\") pod \"ovnkube-node-4zlc7\" (UID: \"4a47d5ee-c928-4a04-8683-8fd4fe63ea9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.260370 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4a47d5ee-c928-4a04-8683-8fd4fe63ea9c-ovnkube-script-lib\") pod \"ovnkube-node-4zlc7\" (UID: \"4a47d5ee-c928-4a04-8683-8fd4fe63ea9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.260386 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4a47d5ee-c928-4a04-8683-8fd4fe63ea9c-run-systemd\") pod \"ovnkube-node-4zlc7\" (UID: \"4a47d5ee-c928-4a04-8683-8fd4fe63ea9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.260419 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4a47d5ee-c928-4a04-8683-8fd4fe63ea9c-var-lib-openvswitch\") pod \"ovnkube-node-4zlc7\" (UID: \"4a47d5ee-c928-4a04-8683-8fd4fe63ea9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.260434 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4a47d5ee-c928-4a04-8683-8fd4fe63ea9c-etc-openvswitch\") pod \"ovnkube-node-4zlc7\" (UID: \"4a47d5ee-c928-4a04-8683-8fd4fe63ea9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.260449 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4a47d5ee-c928-4a04-8683-8fd4fe63ea9c-host-run-ovn-kubernetes\") pod \"ovnkube-node-4zlc7\" (UID: \"4a47d5ee-c928-4a04-8683-8fd4fe63ea9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 
15:09:45.260462 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4a47d5ee-c928-4a04-8683-8fd4fe63ea9c-ovnkube-config\") pod \"ovnkube-node-4zlc7\" (UID: \"4a47d5ee-c928-4a04-8683-8fd4fe63ea9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.260476 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4a47d5ee-c928-4a04-8683-8fd4fe63ea9c-run-ovn\") pod \"ovnkube-node-4zlc7\" (UID: \"4a47d5ee-c928-4a04-8683-8fd4fe63ea9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.260516 4555 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c87d128c-39da-4880-9efc-7d2e6319a854-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.260528 4555 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c87d128c-39da-4880-9efc-7d2e6319a854-run-systemd\") on node \"crc\" DevicePath \"\"" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.260539 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7p8n9\" (UniqueName: \"kubernetes.io/projected/c87d128c-39da-4880-9efc-7d2e6319a854-kube-api-access-7p8n9\") on node \"crc\" DevicePath \"\"" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.260575 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4a47d5ee-c928-4a04-8683-8fd4fe63ea9c-run-ovn\") pod \"ovnkube-node-4zlc7\" (UID: \"4a47d5ee-c928-4a04-8683-8fd4fe63ea9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.260608 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4a47d5ee-c928-4a04-8683-8fd4fe63ea9c-host-cni-netd\") pod \"ovnkube-node-4zlc7\" (UID: \"4a47d5ee-c928-4a04-8683-8fd4fe63ea9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.260658 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4a47d5ee-c928-4a04-8683-8fd4fe63ea9c-systemd-units\") pod \"ovnkube-node-4zlc7\" (UID: \"4a47d5ee-c928-4a04-8683-8fd4fe63ea9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.260689 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4a47d5ee-c928-4a04-8683-8fd4fe63ea9c-node-log\") pod \"ovnkube-node-4zlc7\" (UID: \"4a47d5ee-c928-4a04-8683-8fd4fe63ea9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.260779 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4a47d5ee-c928-4a04-8683-8fd4fe63ea9c-host-slash\") pod \"ovnkube-node-4zlc7\" (UID: \"4a47d5ee-c928-4a04-8683-8fd4fe63ea9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.260846 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/4a47d5ee-c928-4a04-8683-8fd4fe63ea9c-run-openvswitch\") pod \"ovnkube-node-4zlc7\" (UID: \"4a47d5ee-c928-4a04-8683-8fd4fe63ea9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.260862 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4a47d5ee-c928-4a04-8683-8fd4fe63ea9c-log-socket\") pod \"ovnkube-node-4zlc7\" (UID: \"4a47d5ee-c928-4a04-8683-8fd4fe63ea9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.260923 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4a47d5ee-c928-4a04-8683-8fd4fe63ea9c-var-lib-openvswitch\") pod \"ovnkube-node-4zlc7\" (UID: \"4a47d5ee-c928-4a04-8683-8fd4fe63ea9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.261086 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4a47d5ee-c928-4a04-8683-8fd4fe63ea9c-env-overrides\") pod \"ovnkube-node-4zlc7\" (UID: \"4a47d5ee-c928-4a04-8683-8fd4fe63ea9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.261121 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4a47d5ee-c928-4a04-8683-8fd4fe63ea9c-host-run-netns\") pod \"ovnkube-node-4zlc7\" (UID: \"4a47d5ee-c928-4a04-8683-8fd4fe63ea9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.261145 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4a47d5ee-c928-4a04-8683-8fd4fe63ea9c-host-run-ovn-kubernetes\") pod \"ovnkube-node-4zlc7\" (UID: \"4a47d5ee-c928-4a04-8683-8fd4fe63ea9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.261129 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4a47d5ee-c928-4a04-8683-8fd4fe63ea9c-etc-openvswitch\") pod \"ovnkube-node-4zlc7\" (UID: \"4a47d5ee-c928-4a04-8683-8fd4fe63ea9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.261252 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4a47d5ee-c928-4a04-8683-8fd4fe63ea9c-run-systemd\") pod \"ovnkube-node-4zlc7\" (UID: \"4a47d5ee-c928-4a04-8683-8fd4fe63ea9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.261289 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4a47d5ee-c928-4a04-8683-8fd4fe63ea9c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4zlc7\" (UID: \"4a47d5ee-c928-4a04-8683-8fd4fe63ea9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.261340 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4a47d5ee-c928-4a04-8683-8fd4fe63ea9c-host-kubelet\") pod \"ovnkube-node-4zlc7\" 
(UID: \"4a47d5ee-c928-4a04-8683-8fd4fe63ea9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.261476 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4a47d5ee-c928-4a04-8683-8fd4fe63ea9c-host-cni-bin\") pod \"ovnkube-node-4zlc7\" (UID: \"4a47d5ee-c928-4a04-8683-8fd4fe63ea9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.261490 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4a47d5ee-c928-4a04-8683-8fd4fe63ea9c-ovnkube-script-lib\") pod \"ovnkube-node-4zlc7\" (UID: \"4a47d5ee-c928-4a04-8683-8fd4fe63ea9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.261560 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4a47d5ee-c928-4a04-8683-8fd4fe63ea9c-ovnkube-config\") pod \"ovnkube-node-4zlc7\" (UID: \"4a47d5ee-c928-4a04-8683-8fd4fe63ea9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.262934 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4a47d5ee-c928-4a04-8683-8fd4fe63ea9c-ovn-node-metrics-cert\") pod \"ovnkube-node-4zlc7\" (UID: \"4a47d5ee-c928-4a04-8683-8fd4fe63ea9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.279970 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62mwj\" (UniqueName: \"kubernetes.io/projected/4a47d5ee-c928-4a04-8683-8fd4fe63ea9c-kube-api-access-62mwj\") pod \"ovnkube-node-4zlc7\" (UID: \"4a47d5ee-c928-4a04-8683-8fd4fe63ea9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.349817 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p4bl4_c87d128c-39da-4880-9efc-7d2e6319a854/ovn-acl-logging/0.log" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.350354 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p4bl4_c87d128c-39da-4880-9efc-7d2e6319a854/ovn-controller/0.log" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.350678 4555 generic.go:334] "Generic (PLEG): container finished" podID="c87d128c-39da-4880-9efc-7d2e6319a854" containerID="cd6cd4a3aeb4f2f1ad5ad6b22f80134a393ccb4c5b7e22c33d97324c8f70800d" exitCode=0 Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.350708 4555 generic.go:334] "Generic (PLEG): container finished" podID="c87d128c-39da-4880-9efc-7d2e6319a854" containerID="4dee71204e7a71c311143a6ed6fbde69968bc877053d6a400d252eaa010a88ac" exitCode=0 Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.350760 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" event={"ID":"c87d128c-39da-4880-9efc-7d2e6319a854","Type":"ContainerDied","Data":"cd6cd4a3aeb4f2f1ad5ad6b22f80134a393ccb4c5b7e22c33d97324c8f70800d"} Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.350782 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" 
event={"ID":"c87d128c-39da-4880-9efc-7d2e6319a854","Type":"ContainerDied","Data":"4dee71204e7a71c311143a6ed6fbde69968bc877053d6a400d252eaa010a88ac"} Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.350760 4555 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.350801 4555 scope.go:117] "RemoveContainer" containerID="57d2adc84a2cd1f31fcedce27250e4e7255a9cc27f0d8422591d743dfb996466" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.350792 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p4bl4" event={"ID":"c87d128c-39da-4880-9efc-7d2e6319a854","Type":"ContainerDied","Data":"a1ccd13de6fb64e2a009e568b924fd28d4b6edc43b474b7b6a7461ff1b7a1970"} Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.352835 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gjjcm_3e34542c-29e8-43d3-add3-42aa080e0c8a/kube-multus/2.log" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.364222 4555 scope.go:117] "RemoveContainer" containerID="d9dc3533604ee31d652a75a8c65146c35b320c41f2d4fe85c3c03cadb99a9d46" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.372625 4555 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-p4bl4"] Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.375458 4555 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-p4bl4"] Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.376983 4555 scope.go:117] "RemoveContainer" containerID="a05bba24ff7f1d956fa0d3cb5949de61a278cddb4674bdb723749a2b005d82d4" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.387661 4555 scope.go:117] "RemoveContainer" containerID="ec3d3a350dce6c9b5ea637d00abdaf43796a5595105dc82133f3d84349c12c25" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.400652 4555 scope.go:117] "RemoveContainer" containerID="cd6cd4a3aeb4f2f1ad5ad6b22f80134a393ccb4c5b7e22c33d97324c8f70800d" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.413239 4555 scope.go:117] "RemoveContainer" containerID="4dee71204e7a71c311143a6ed6fbde69968bc877053d6a400d252eaa010a88ac" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.434058 4555 scope.go:117] "RemoveContainer" containerID="932166905ddf0fc44b9b7356a8f5efe68e1d44307d8e486946c8285ae969b8e5" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.444187 4555 scope.go:117] "RemoveContainer" containerID="64685d2778ef0cf9d316897042a9e9353603a0cc7a234baf821ec03b149aeb07" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.457648 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.458891 4555 scope.go:117] "RemoveContainer" containerID="c9ce6bcdccee7782ed6a2472052064ea5f0f9a5f25a8645c692165a2a4bf16dd" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.474466 4555 scope.go:117] "RemoveContainer" containerID="57d2adc84a2cd1f31fcedce27250e4e7255a9cc27f0d8422591d743dfb996466" Dec 09 15:09:45 crc kubenswrapper[4555]: E1209 15:09:45.474800 4555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57d2adc84a2cd1f31fcedce27250e4e7255a9cc27f0d8422591d743dfb996466\": container with ID starting with 57d2adc84a2cd1f31fcedce27250e4e7255a9cc27f0d8422591d743dfb996466 not found: ID does not exist" containerID="57d2adc84a2cd1f31fcedce27250e4e7255a9cc27f0d8422591d743dfb996466" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.474827 4555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57d2adc84a2cd1f31fcedce27250e4e7255a9cc27f0d8422591d743dfb996466"} err="failed to get container status \"57d2adc84a2cd1f31fcedce27250e4e7255a9cc27f0d8422591d743dfb996466\": rpc error: code = NotFound desc = could not find container \"57d2adc84a2cd1f31fcedce27250e4e7255a9cc27f0d8422591d743dfb996466\": container with ID starting with 57d2adc84a2cd1f31fcedce27250e4e7255a9cc27f0d8422591d743dfb996466 not found: ID does not exist" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.474848 4555 scope.go:117] "RemoveContainer" containerID="d9dc3533604ee31d652a75a8c65146c35b320c41f2d4fe85c3c03cadb99a9d46" Dec 09 15:09:45 crc kubenswrapper[4555]: E1209 15:09:45.475016 4555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9dc3533604ee31d652a75a8c65146c35b320c41f2d4fe85c3c03cadb99a9d46\": container with ID starting with d9dc3533604ee31d652a75a8c65146c35b320c41f2d4fe85c3c03cadb99a9d46 not found: ID does not exist" containerID="d9dc3533604ee31d652a75a8c65146c35b320c41f2d4fe85c3c03cadb99a9d46" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.475036 4555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9dc3533604ee31d652a75a8c65146c35b320c41f2d4fe85c3c03cadb99a9d46"} err="failed to get container status \"d9dc3533604ee31d652a75a8c65146c35b320c41f2d4fe85c3c03cadb99a9d46\": rpc error: code = NotFound desc = could not find container \"d9dc3533604ee31d652a75a8c65146c35b320c41f2d4fe85c3c03cadb99a9d46\": container with ID starting with d9dc3533604ee31d652a75a8c65146c35b320c41f2d4fe85c3c03cadb99a9d46 not found: ID does not exist" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.475049 4555 scope.go:117] "RemoveContainer" containerID="a05bba24ff7f1d956fa0d3cb5949de61a278cddb4674bdb723749a2b005d82d4" Dec 09 15:09:45 crc kubenswrapper[4555]: E1209 15:09:45.475232 4555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a05bba24ff7f1d956fa0d3cb5949de61a278cddb4674bdb723749a2b005d82d4\": container with ID starting with a05bba24ff7f1d956fa0d3cb5949de61a278cddb4674bdb723749a2b005d82d4 not found: ID does not exist" containerID="a05bba24ff7f1d956fa0d3cb5949de61a278cddb4674bdb723749a2b005d82d4" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.475251 4555 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a05bba24ff7f1d956fa0d3cb5949de61a278cddb4674bdb723749a2b005d82d4"} err="failed to get container status \"a05bba24ff7f1d956fa0d3cb5949de61a278cddb4674bdb723749a2b005d82d4\": rpc error: code = NotFound desc = could not find container \"a05bba24ff7f1d956fa0d3cb5949de61a278cddb4674bdb723749a2b005d82d4\": container with ID starting with a05bba24ff7f1d956fa0d3cb5949de61a278cddb4674bdb723749a2b005d82d4 not found: ID does not exist" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.475262 4555 scope.go:117] "RemoveContainer" containerID="ec3d3a350dce6c9b5ea637d00abdaf43796a5595105dc82133f3d84349c12c25" Dec 09 15:09:45 crc kubenswrapper[4555]: E1209 15:09:45.475410 4555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec3d3a350dce6c9b5ea637d00abdaf43796a5595105dc82133f3d84349c12c25\": container with ID starting with ec3d3a350dce6c9b5ea637d00abdaf43796a5595105dc82133f3d84349c12c25 not found: ID does not exist" containerID="ec3d3a350dce6c9b5ea637d00abdaf43796a5595105dc82133f3d84349c12c25" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.475428 4555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec3d3a350dce6c9b5ea637d00abdaf43796a5595105dc82133f3d84349c12c25"} err="failed to get container status \"ec3d3a350dce6c9b5ea637d00abdaf43796a5595105dc82133f3d84349c12c25\": rpc error: code = NotFound desc = could not find container \"ec3d3a350dce6c9b5ea637d00abdaf43796a5595105dc82133f3d84349c12c25\": container with ID starting with ec3d3a350dce6c9b5ea637d00abdaf43796a5595105dc82133f3d84349c12c25 not found: ID does not exist" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.475439 4555 scope.go:117] "RemoveContainer" containerID="cd6cd4a3aeb4f2f1ad5ad6b22f80134a393ccb4c5b7e22c33d97324c8f70800d" Dec 09 15:09:45 crc kubenswrapper[4555]: E1209 15:09:45.475591 4555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd6cd4a3aeb4f2f1ad5ad6b22f80134a393ccb4c5b7e22c33d97324c8f70800d\": container with ID starting with cd6cd4a3aeb4f2f1ad5ad6b22f80134a393ccb4c5b7e22c33d97324c8f70800d not found: ID does not exist" containerID="cd6cd4a3aeb4f2f1ad5ad6b22f80134a393ccb4c5b7e22c33d97324c8f70800d" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.475608 4555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd6cd4a3aeb4f2f1ad5ad6b22f80134a393ccb4c5b7e22c33d97324c8f70800d"} err="failed to get container status \"cd6cd4a3aeb4f2f1ad5ad6b22f80134a393ccb4c5b7e22c33d97324c8f70800d\": rpc error: code = NotFound desc = could not find container \"cd6cd4a3aeb4f2f1ad5ad6b22f80134a393ccb4c5b7e22c33d97324c8f70800d\": container with ID starting with cd6cd4a3aeb4f2f1ad5ad6b22f80134a393ccb4c5b7e22c33d97324c8f70800d not found: ID does not exist" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.475620 4555 scope.go:117] "RemoveContainer" containerID="4dee71204e7a71c311143a6ed6fbde69968bc877053d6a400d252eaa010a88ac" Dec 09 15:09:45 crc kubenswrapper[4555]: E1209 15:09:45.475775 4555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4dee71204e7a71c311143a6ed6fbde69968bc877053d6a400d252eaa010a88ac\": container with ID starting with 4dee71204e7a71c311143a6ed6fbde69968bc877053d6a400d252eaa010a88ac not found: ID does not exist" 
containerID="4dee71204e7a71c311143a6ed6fbde69968bc877053d6a400d252eaa010a88ac" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.475792 4555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4dee71204e7a71c311143a6ed6fbde69968bc877053d6a400d252eaa010a88ac"} err="failed to get container status \"4dee71204e7a71c311143a6ed6fbde69968bc877053d6a400d252eaa010a88ac\": rpc error: code = NotFound desc = could not find container \"4dee71204e7a71c311143a6ed6fbde69968bc877053d6a400d252eaa010a88ac\": container with ID starting with 4dee71204e7a71c311143a6ed6fbde69968bc877053d6a400d252eaa010a88ac not found: ID does not exist" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.475804 4555 scope.go:117] "RemoveContainer" containerID="932166905ddf0fc44b9b7356a8f5efe68e1d44307d8e486946c8285ae969b8e5" Dec 09 15:09:45 crc kubenswrapper[4555]: E1209 15:09:45.475945 4555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"932166905ddf0fc44b9b7356a8f5efe68e1d44307d8e486946c8285ae969b8e5\": container with ID starting with 932166905ddf0fc44b9b7356a8f5efe68e1d44307d8e486946c8285ae969b8e5 not found: ID does not exist" containerID="932166905ddf0fc44b9b7356a8f5efe68e1d44307d8e486946c8285ae969b8e5" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.475962 4555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"932166905ddf0fc44b9b7356a8f5efe68e1d44307d8e486946c8285ae969b8e5"} err="failed to get container status \"932166905ddf0fc44b9b7356a8f5efe68e1d44307d8e486946c8285ae969b8e5\": rpc error: code = NotFound desc = could not find container \"932166905ddf0fc44b9b7356a8f5efe68e1d44307d8e486946c8285ae969b8e5\": container with ID starting with 932166905ddf0fc44b9b7356a8f5efe68e1d44307d8e486946c8285ae969b8e5 not found: ID does not exist" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.475973 4555 scope.go:117] "RemoveContainer" containerID="64685d2778ef0cf9d316897042a9e9353603a0cc7a234baf821ec03b149aeb07" Dec 09 15:09:45 crc kubenswrapper[4555]: E1209 15:09:45.476114 4555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64685d2778ef0cf9d316897042a9e9353603a0cc7a234baf821ec03b149aeb07\": container with ID starting with 64685d2778ef0cf9d316897042a9e9353603a0cc7a234baf821ec03b149aeb07 not found: ID does not exist" containerID="64685d2778ef0cf9d316897042a9e9353603a0cc7a234baf821ec03b149aeb07" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.476132 4555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64685d2778ef0cf9d316897042a9e9353603a0cc7a234baf821ec03b149aeb07"} err="failed to get container status \"64685d2778ef0cf9d316897042a9e9353603a0cc7a234baf821ec03b149aeb07\": rpc error: code = NotFound desc = could not find container \"64685d2778ef0cf9d316897042a9e9353603a0cc7a234baf821ec03b149aeb07\": container with ID starting with 64685d2778ef0cf9d316897042a9e9353603a0cc7a234baf821ec03b149aeb07 not found: ID does not exist" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.476144 4555 scope.go:117] "RemoveContainer" containerID="c9ce6bcdccee7782ed6a2472052064ea5f0f9a5f25a8645c692165a2a4bf16dd" Dec 09 15:09:45 crc kubenswrapper[4555]: E1209 15:09:45.476281 4555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"c9ce6bcdccee7782ed6a2472052064ea5f0f9a5f25a8645c692165a2a4bf16dd\": container with ID starting with c9ce6bcdccee7782ed6a2472052064ea5f0f9a5f25a8645c692165a2a4bf16dd not found: ID does not exist" containerID="c9ce6bcdccee7782ed6a2472052064ea5f0f9a5f25a8645c692165a2a4bf16dd" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.476312 4555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9ce6bcdccee7782ed6a2472052064ea5f0f9a5f25a8645c692165a2a4bf16dd"} err="failed to get container status \"c9ce6bcdccee7782ed6a2472052064ea5f0f9a5f25a8645c692165a2a4bf16dd\": rpc error: code = NotFound desc = could not find container \"c9ce6bcdccee7782ed6a2472052064ea5f0f9a5f25a8645c692165a2a4bf16dd\": container with ID starting with c9ce6bcdccee7782ed6a2472052064ea5f0f9a5f25a8645c692165a2a4bf16dd not found: ID does not exist" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.476323 4555 scope.go:117] "RemoveContainer" containerID="57d2adc84a2cd1f31fcedce27250e4e7255a9cc27f0d8422591d743dfb996466" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.476462 4555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57d2adc84a2cd1f31fcedce27250e4e7255a9cc27f0d8422591d743dfb996466"} err="failed to get container status \"57d2adc84a2cd1f31fcedce27250e4e7255a9cc27f0d8422591d743dfb996466\": rpc error: code = NotFound desc = could not find container \"57d2adc84a2cd1f31fcedce27250e4e7255a9cc27f0d8422591d743dfb996466\": container with ID starting with 57d2adc84a2cd1f31fcedce27250e4e7255a9cc27f0d8422591d743dfb996466 not found: ID does not exist" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.476480 4555 scope.go:117] "RemoveContainer" containerID="d9dc3533604ee31d652a75a8c65146c35b320c41f2d4fe85c3c03cadb99a9d46" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.476614 4555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9dc3533604ee31d652a75a8c65146c35b320c41f2d4fe85c3c03cadb99a9d46"} err="failed to get container status \"d9dc3533604ee31d652a75a8c65146c35b320c41f2d4fe85c3c03cadb99a9d46\": rpc error: code = NotFound desc = could not find container \"d9dc3533604ee31d652a75a8c65146c35b320c41f2d4fe85c3c03cadb99a9d46\": container with ID starting with d9dc3533604ee31d652a75a8c65146c35b320c41f2d4fe85c3c03cadb99a9d46 not found: ID does not exist" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.476630 4555 scope.go:117] "RemoveContainer" containerID="a05bba24ff7f1d956fa0d3cb5949de61a278cddb4674bdb723749a2b005d82d4" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.476778 4555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a05bba24ff7f1d956fa0d3cb5949de61a278cddb4674bdb723749a2b005d82d4"} err="failed to get container status \"a05bba24ff7f1d956fa0d3cb5949de61a278cddb4674bdb723749a2b005d82d4\": rpc error: code = NotFound desc = could not find container \"a05bba24ff7f1d956fa0d3cb5949de61a278cddb4674bdb723749a2b005d82d4\": container with ID starting with a05bba24ff7f1d956fa0d3cb5949de61a278cddb4674bdb723749a2b005d82d4 not found: ID does not exist" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.476795 4555 scope.go:117] "RemoveContainer" containerID="ec3d3a350dce6c9b5ea637d00abdaf43796a5595105dc82133f3d84349c12c25" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.476931 4555 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ec3d3a350dce6c9b5ea637d00abdaf43796a5595105dc82133f3d84349c12c25"} err="failed to get container status \"ec3d3a350dce6c9b5ea637d00abdaf43796a5595105dc82133f3d84349c12c25\": rpc error: code = NotFound desc = could not find container \"ec3d3a350dce6c9b5ea637d00abdaf43796a5595105dc82133f3d84349c12c25\": container with ID starting with ec3d3a350dce6c9b5ea637d00abdaf43796a5595105dc82133f3d84349c12c25 not found: ID does not exist" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.476946 4555 scope.go:117] "RemoveContainer" containerID="cd6cd4a3aeb4f2f1ad5ad6b22f80134a393ccb4c5b7e22c33d97324c8f70800d" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.477074 4555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd6cd4a3aeb4f2f1ad5ad6b22f80134a393ccb4c5b7e22c33d97324c8f70800d"} err="failed to get container status \"cd6cd4a3aeb4f2f1ad5ad6b22f80134a393ccb4c5b7e22c33d97324c8f70800d\": rpc error: code = NotFound desc = could not find container \"cd6cd4a3aeb4f2f1ad5ad6b22f80134a393ccb4c5b7e22c33d97324c8f70800d\": container with ID starting with cd6cd4a3aeb4f2f1ad5ad6b22f80134a393ccb4c5b7e22c33d97324c8f70800d not found: ID does not exist" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.477091 4555 scope.go:117] "RemoveContainer" containerID="4dee71204e7a71c311143a6ed6fbde69968bc877053d6a400d252eaa010a88ac" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.477236 4555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4dee71204e7a71c311143a6ed6fbde69968bc877053d6a400d252eaa010a88ac"} err="failed to get container status \"4dee71204e7a71c311143a6ed6fbde69968bc877053d6a400d252eaa010a88ac\": rpc error: code = NotFound desc = could not find container \"4dee71204e7a71c311143a6ed6fbde69968bc877053d6a400d252eaa010a88ac\": container with ID starting with 4dee71204e7a71c311143a6ed6fbde69968bc877053d6a400d252eaa010a88ac not found: ID does not exist" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.477253 4555 scope.go:117] "RemoveContainer" containerID="932166905ddf0fc44b9b7356a8f5efe68e1d44307d8e486946c8285ae969b8e5" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.477409 4555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"932166905ddf0fc44b9b7356a8f5efe68e1d44307d8e486946c8285ae969b8e5"} err="failed to get container status \"932166905ddf0fc44b9b7356a8f5efe68e1d44307d8e486946c8285ae969b8e5\": rpc error: code = NotFound desc = could not find container \"932166905ddf0fc44b9b7356a8f5efe68e1d44307d8e486946c8285ae969b8e5\": container with ID starting with 932166905ddf0fc44b9b7356a8f5efe68e1d44307d8e486946c8285ae969b8e5 not found: ID does not exist" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.477425 4555 scope.go:117] "RemoveContainer" containerID="64685d2778ef0cf9d316897042a9e9353603a0cc7a234baf821ec03b149aeb07" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.477564 4555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64685d2778ef0cf9d316897042a9e9353603a0cc7a234baf821ec03b149aeb07"} err="failed to get container status \"64685d2778ef0cf9d316897042a9e9353603a0cc7a234baf821ec03b149aeb07\": rpc error: code = NotFound desc = could not find container \"64685d2778ef0cf9d316897042a9e9353603a0cc7a234baf821ec03b149aeb07\": container with ID starting with 64685d2778ef0cf9d316897042a9e9353603a0cc7a234baf821ec03b149aeb07 not found: ID does not exist" Dec 
09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.477580 4555 scope.go:117] "RemoveContainer" containerID="c9ce6bcdccee7782ed6a2472052064ea5f0f9a5f25a8645c692165a2a4bf16dd" Dec 09 15:09:45 crc kubenswrapper[4555]: I1209 15:09:45.477720 4555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9ce6bcdccee7782ed6a2472052064ea5f0f9a5f25a8645c692165a2a4bf16dd"} err="failed to get container status \"c9ce6bcdccee7782ed6a2472052064ea5f0f9a5f25a8645c692165a2a4bf16dd\": rpc error: code = NotFound desc = could not find container \"c9ce6bcdccee7782ed6a2472052064ea5f0f9a5f25a8645c692165a2a4bf16dd\": container with ID starting with c9ce6bcdccee7782ed6a2472052064ea5f0f9a5f25a8645c692165a2a4bf16dd not found: ID does not exist" Dec 09 15:09:46 crc kubenswrapper[4555]: I1209 15:09:46.166242 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c87d128c-39da-4880-9efc-7d2e6319a854" path="/var/lib/kubelet/pods/c87d128c-39da-4880-9efc-7d2e6319a854/volumes" Dec 09 15:09:46 crc kubenswrapper[4555]: I1209 15:09:46.237522 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-v4h96"] Dec 09 15:09:46 crc kubenswrapper[4555]: I1209 15:09:46.238119 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-v4h96" Dec 09 15:09:46 crc kubenswrapper[4555]: I1209 15:09:46.239843 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Dec 09 15:09:46 crc kubenswrapper[4555]: I1209 15:09:46.242815 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Dec 09 15:09:46 crc kubenswrapper[4555]: I1209 15:09:46.242914 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-xq8kx" Dec 09 15:09:46 crc kubenswrapper[4555]: I1209 15:09:46.274034 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bx4pm\" (UniqueName: \"kubernetes.io/projected/ce4b9ae1-9c56-467a-908e-3b990e17f1cd-kube-api-access-bx4pm\") pod \"obo-prometheus-operator-668cf9dfbb-v4h96\" (UID: \"ce4b9ae1-9c56-467a-908e-3b990e17f1cd\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-v4h96" Dec 09 15:09:46 crc kubenswrapper[4555]: I1209 15:09:46.353645 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-79cb85764c-4sw2g"] Dec 09 15:09:46 crc kubenswrapper[4555]: I1209 15:09:46.354245 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-79cb85764c-4sw2g" Dec 09 15:09:46 crc kubenswrapper[4555]: I1209 15:09:46.357440 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Dec 09 15:09:46 crc kubenswrapper[4555]: I1209 15:09:46.357591 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-mrv65" Dec 09 15:09:46 crc kubenswrapper[4555]: I1209 15:09:46.359279 4555 generic.go:334] "Generic (PLEG): container finished" podID="4a47d5ee-c928-4a04-8683-8fd4fe63ea9c" containerID="9782926e8d112ff586b984d18a8f01dbd017c97ccc73b6f3cd6af10355fe82b8" exitCode=0 Dec 09 15:09:46 crc kubenswrapper[4555]: I1209 15:09:46.359327 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" event={"ID":"4a47d5ee-c928-4a04-8683-8fd4fe63ea9c","Type":"ContainerDied","Data":"9782926e8d112ff586b984d18a8f01dbd017c97ccc73b6f3cd6af10355fe82b8"} Dec 09 15:09:46 crc kubenswrapper[4555]: I1209 15:09:46.359560 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" event={"ID":"4a47d5ee-c928-4a04-8683-8fd4fe63ea9c","Type":"ContainerStarted","Data":"c4356ba7c925f9ff19811f2ef128bd81095297cfdd987a8e57ce14df74b91da3"} Dec 09 15:09:46 crc kubenswrapper[4555]: I1209 15:09:46.362556 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-79cb85764c-ff8sc"] Dec 09 15:09:46 crc kubenswrapper[4555]: I1209 15:09:46.363197 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-79cb85764c-ff8sc" Dec 09 15:09:46 crc kubenswrapper[4555]: I1209 15:09:46.374854 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/42ba541e-2ff1-45bb-85d9-9f4ee87e0bb9-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-79cb85764c-4sw2g\" (UID: \"42ba541e-2ff1-45bb-85d9-9f4ee87e0bb9\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-79cb85764c-4sw2g" Dec 09 15:09:46 crc kubenswrapper[4555]: I1209 15:09:46.374908 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/42ba541e-2ff1-45bb-85d9-9f4ee87e0bb9-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-79cb85764c-4sw2g\" (UID: \"42ba541e-2ff1-45bb-85d9-9f4ee87e0bb9\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-79cb85764c-4sw2g" Dec 09 15:09:46 crc kubenswrapper[4555]: I1209 15:09:46.375187 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bx4pm\" (UniqueName: \"kubernetes.io/projected/ce4b9ae1-9c56-467a-908e-3b990e17f1cd-kube-api-access-bx4pm\") pod \"obo-prometheus-operator-668cf9dfbb-v4h96\" (UID: \"ce4b9ae1-9c56-467a-908e-3b990e17f1cd\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-v4h96" Dec 09 15:09:46 crc kubenswrapper[4555]: I1209 15:09:46.402974 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bx4pm\" (UniqueName: \"kubernetes.io/projected/ce4b9ae1-9c56-467a-908e-3b990e17f1cd-kube-api-access-bx4pm\") pod \"obo-prometheus-operator-668cf9dfbb-v4h96\" (UID: 
\"ce4b9ae1-9c56-467a-908e-3b990e17f1cd\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-v4h96" Dec 09 15:09:46 crc kubenswrapper[4555]: I1209 15:09:46.475877 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b486a37e-2de3-4639-b189-e69c501f9a6f-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-79cb85764c-ff8sc\" (UID: \"b486a37e-2de3-4639-b189-e69c501f9a6f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-79cb85764c-ff8sc" Dec 09 15:09:46 crc kubenswrapper[4555]: I1209 15:09:46.475973 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/42ba541e-2ff1-45bb-85d9-9f4ee87e0bb9-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-79cb85764c-4sw2g\" (UID: \"42ba541e-2ff1-45bb-85d9-9f4ee87e0bb9\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-79cb85764c-4sw2g" Dec 09 15:09:46 crc kubenswrapper[4555]: I1209 15:09:46.476073 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/42ba541e-2ff1-45bb-85d9-9f4ee87e0bb9-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-79cb85764c-4sw2g\" (UID: \"42ba541e-2ff1-45bb-85d9-9f4ee87e0bb9\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-79cb85764c-4sw2g" Dec 09 15:09:46 crc kubenswrapper[4555]: I1209 15:09:46.476181 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b486a37e-2de3-4639-b189-e69c501f9a6f-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-79cb85764c-ff8sc\" (UID: \"b486a37e-2de3-4639-b189-e69c501f9a6f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-79cb85764c-ff8sc" Dec 09 15:09:46 crc kubenswrapper[4555]: I1209 15:09:46.478569 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/42ba541e-2ff1-45bb-85d9-9f4ee87e0bb9-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-79cb85764c-4sw2g\" (UID: \"42ba541e-2ff1-45bb-85d9-9f4ee87e0bb9\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-79cb85764c-4sw2g" Dec 09 15:09:46 crc kubenswrapper[4555]: I1209 15:09:46.486093 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/42ba541e-2ff1-45bb-85d9-9f4ee87e0bb9-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-79cb85764c-4sw2g\" (UID: \"42ba541e-2ff1-45bb-85d9-9f4ee87e0bb9\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-79cb85764c-4sw2g" Dec 09 15:09:46 crc kubenswrapper[4555]: I1209 15:09:46.550755 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-v4h96" Dec 09 15:09:46 crc kubenswrapper[4555]: I1209 15:09:46.562570 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-7l249"] Dec 09 15:09:46 crc kubenswrapper[4555]: I1209 15:09:46.563232 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-7l249" Dec 09 15:09:46 crc kubenswrapper[4555]: I1209 15:09:46.565645 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-6p8jl" Dec 09 15:09:46 crc kubenswrapper[4555]: I1209 15:09:46.566397 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Dec 09 15:09:46 crc kubenswrapper[4555]: E1209 15:09:46.575275 4555 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-v4h96_openshift-operators_ce4b9ae1-9c56-467a-908e-3b990e17f1cd_0(216973ac57bd5ad9ece67f39bc48920ddbf777dec2a651b30f921813bc2d653f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 09 15:09:46 crc kubenswrapper[4555]: E1209 15:09:46.575345 4555 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-v4h96_openshift-operators_ce4b9ae1-9c56-467a-908e-3b990e17f1cd_0(216973ac57bd5ad9ece67f39bc48920ddbf777dec2a651b30f921813bc2d653f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-v4h96" Dec 09 15:09:46 crc kubenswrapper[4555]: E1209 15:09:46.575367 4555 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-v4h96_openshift-operators_ce4b9ae1-9c56-467a-908e-3b990e17f1cd_0(216973ac57bd5ad9ece67f39bc48920ddbf777dec2a651b30f921813bc2d653f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-v4h96" Dec 09 15:09:46 crc kubenswrapper[4555]: E1209 15:09:46.575419 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-668cf9dfbb-v4h96_openshift-operators(ce4b9ae1-9c56-467a-908e-3b990e17f1cd)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-668cf9dfbb-v4h96_openshift-operators(ce4b9ae1-9c56-467a-908e-3b990e17f1cd)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-v4h96_openshift-operators_ce4b9ae1-9c56-467a-908e-3b990e17f1cd_0(216973ac57bd5ad9ece67f39bc48920ddbf777dec2a651b30f921813bc2d653f): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-v4h96" podUID="ce4b9ae1-9c56-467a-908e-3b990e17f1cd" Dec 09 15:09:46 crc kubenswrapper[4555]: I1209 15:09:46.576905 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b486a37e-2de3-4639-b189-e69c501f9a6f-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-79cb85764c-ff8sc\" (UID: \"b486a37e-2de3-4639-b189-e69c501f9a6f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-79cb85764c-ff8sc" Dec 09 15:09:46 crc kubenswrapper[4555]: I1209 15:09:46.576958 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b486a37e-2de3-4639-b189-e69c501f9a6f-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-79cb85764c-ff8sc\" (UID: \"b486a37e-2de3-4639-b189-e69c501f9a6f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-79cb85764c-ff8sc" Dec 09 15:09:46 crc kubenswrapper[4555]: I1209 15:09:46.582203 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b486a37e-2de3-4639-b189-e69c501f9a6f-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-79cb85764c-ff8sc\" (UID: \"b486a37e-2de3-4639-b189-e69c501f9a6f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-79cb85764c-ff8sc" Dec 09 15:09:46 crc kubenswrapper[4555]: I1209 15:09:46.582717 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b486a37e-2de3-4639-b189-e69c501f9a6f-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-79cb85764c-ff8sc\" (UID: \"b486a37e-2de3-4639-b189-e69c501f9a6f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-79cb85764c-ff8sc" Dec 09 15:09:46 crc kubenswrapper[4555]: I1209 15:09:46.666076 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-79cb85764c-4sw2g" Dec 09 15:09:46 crc kubenswrapper[4555]: I1209 15:09:46.674069 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-79cb85764c-ff8sc" Dec 09 15:09:46 crc kubenswrapper[4555]: I1209 15:09:46.678262 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvr67\" (UniqueName: \"kubernetes.io/projected/2302fd31-2414-419b-9733-4d69ef2d00fe-kube-api-access-gvr67\") pod \"observability-operator-d8bb48f5d-7l249\" (UID: \"2302fd31-2414-419b-9733-4d69ef2d00fe\") " pod="openshift-operators/observability-operator-d8bb48f5d-7l249" Dec 09 15:09:46 crc kubenswrapper[4555]: I1209 15:09:46.678335 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/2302fd31-2414-419b-9733-4d69ef2d00fe-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-7l249\" (UID: \"2302fd31-2414-419b-9733-4d69ef2d00fe\") " pod="openshift-operators/observability-operator-d8bb48f5d-7l249" Dec 09 15:09:46 crc kubenswrapper[4555]: E1209 15:09:46.689443 4555 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-79cb85764c-4sw2g_openshift-operators_42ba541e-2ff1-45bb-85d9-9f4ee87e0bb9_0(6d5dec0a7c93d40fe549e443705bb77c942028e788e839653d2122325e5db3c8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 09 15:09:46 crc kubenswrapper[4555]: E1209 15:09:46.689499 4555 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-79cb85764c-4sw2g_openshift-operators_42ba541e-2ff1-45bb-85d9-9f4ee87e0bb9_0(6d5dec0a7c93d40fe549e443705bb77c942028e788e839653d2122325e5db3c8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-79cb85764c-4sw2g" Dec 09 15:09:46 crc kubenswrapper[4555]: E1209 15:09:46.689518 4555 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-79cb85764c-4sw2g_openshift-operators_42ba541e-2ff1-45bb-85d9-9f4ee87e0bb9_0(6d5dec0a7c93d40fe549e443705bb77c942028e788e839653d2122325e5db3c8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-79cb85764c-4sw2g" Dec 09 15:09:46 crc kubenswrapper[4555]: E1209 15:09:46.689560 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-79cb85764c-4sw2g_openshift-operators(42ba541e-2ff1-45bb-85d9-9f4ee87e0bb9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-79cb85764c-4sw2g_openshift-operators(42ba541e-2ff1-45bb-85d9-9f4ee87e0bb9)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-79cb85764c-4sw2g_openshift-operators_42ba541e-2ff1-45bb-85d9-9f4ee87e0bb9_0(6d5dec0a7c93d40fe549e443705bb77c942028e788e839653d2122325e5db3c8): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-79cb85764c-4sw2g" podUID="42ba541e-2ff1-45bb-85d9-9f4ee87e0bb9" Dec 09 15:09:46 crc kubenswrapper[4555]: E1209 15:09:46.703550 4555 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-79cb85764c-ff8sc_openshift-operators_b486a37e-2de3-4639-b189-e69c501f9a6f_0(4f47b1bbbf4ec2866fa4658d8f08986e7cf4f5381b096145f158ac0a7e12b4ab): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 09 15:09:46 crc kubenswrapper[4555]: E1209 15:09:46.703613 4555 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-79cb85764c-ff8sc_openshift-operators_b486a37e-2de3-4639-b189-e69c501f9a6f_0(4f47b1bbbf4ec2866fa4658d8f08986e7cf4f5381b096145f158ac0a7e12b4ab): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-79cb85764c-ff8sc" Dec 09 15:09:46 crc kubenswrapper[4555]: E1209 15:09:46.703635 4555 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-79cb85764c-ff8sc_openshift-operators_b486a37e-2de3-4639-b189-e69c501f9a6f_0(4f47b1bbbf4ec2866fa4658d8f08986e7cf4f5381b096145f158ac0a7e12b4ab): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-79cb85764c-ff8sc" Dec 09 15:09:46 crc kubenswrapper[4555]: E1209 15:09:46.703669 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-79cb85764c-ff8sc_openshift-operators(b486a37e-2de3-4639-b189-e69c501f9a6f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-79cb85764c-ff8sc_openshift-operators(b486a37e-2de3-4639-b189-e69c501f9a6f)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-79cb85764c-ff8sc_openshift-operators_b486a37e-2de3-4639-b189-e69c501f9a6f_0(4f47b1bbbf4ec2866fa4658d8f08986e7cf4f5381b096145f158ac0a7e12b4ab): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-79cb85764c-ff8sc" podUID="b486a37e-2de3-4639-b189-e69c501f9a6f" Dec 09 15:09:46 crc kubenswrapper[4555]: I1209 15:09:46.755004 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5446b9c989-8rffd"] Dec 09 15:09:46 crc kubenswrapper[4555]: I1209 15:09:46.755643 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-8rffd" Dec 09 15:09:46 crc kubenswrapper[4555]: I1209 15:09:46.757526 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-gltqw" Dec 09 15:09:46 crc kubenswrapper[4555]: I1209 15:09:46.779752 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvr67\" (UniqueName: \"kubernetes.io/projected/2302fd31-2414-419b-9733-4d69ef2d00fe-kube-api-access-gvr67\") pod \"observability-operator-d8bb48f5d-7l249\" (UID: \"2302fd31-2414-419b-9733-4d69ef2d00fe\") " pod="openshift-operators/observability-operator-d8bb48f5d-7l249" Dec 09 15:09:46 crc kubenswrapper[4555]: I1209 15:09:46.779787 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/2302fd31-2414-419b-9733-4d69ef2d00fe-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-7l249\" (UID: \"2302fd31-2414-419b-9733-4d69ef2d00fe\") " pod="openshift-operators/observability-operator-d8bb48f5d-7l249" Dec 09 15:09:46 crc kubenswrapper[4555]: I1209 15:09:46.779812 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/7bd718ca-da15-4b6c-89ca-01b62bab1393-openshift-service-ca\") pod \"perses-operator-5446b9c989-8rffd\" (UID: \"7bd718ca-da15-4b6c-89ca-01b62bab1393\") " pod="openshift-operators/perses-operator-5446b9c989-8rffd" Dec 09 15:09:46 crc kubenswrapper[4555]: I1209 15:09:46.779851 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5tm8\" (UniqueName: \"kubernetes.io/projected/7bd718ca-da15-4b6c-89ca-01b62bab1393-kube-api-access-w5tm8\") pod \"perses-operator-5446b9c989-8rffd\" (UID: \"7bd718ca-da15-4b6c-89ca-01b62bab1393\") " pod="openshift-operators/perses-operator-5446b9c989-8rffd" Dec 09 15:09:46 crc kubenswrapper[4555]: I1209 15:09:46.783079 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/2302fd31-2414-419b-9733-4d69ef2d00fe-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-7l249\" (UID: \"2302fd31-2414-419b-9733-4d69ef2d00fe\") " pod="openshift-operators/observability-operator-d8bb48f5d-7l249" Dec 09 15:09:46 crc kubenswrapper[4555]: I1209 15:09:46.795525 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvr67\" (UniqueName: \"kubernetes.io/projected/2302fd31-2414-419b-9733-4d69ef2d00fe-kube-api-access-gvr67\") pod \"observability-operator-d8bb48f5d-7l249\" (UID: \"2302fd31-2414-419b-9733-4d69ef2d00fe\") " pod="openshift-operators/observability-operator-d8bb48f5d-7l249" Dec 09 15:09:46 crc kubenswrapper[4555]: I1209 15:09:46.874661 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-7l249" Dec 09 15:09:46 crc kubenswrapper[4555]: I1209 15:09:46.880989 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/7bd718ca-da15-4b6c-89ca-01b62bab1393-openshift-service-ca\") pod \"perses-operator-5446b9c989-8rffd\" (UID: \"7bd718ca-da15-4b6c-89ca-01b62bab1393\") " pod="openshift-operators/perses-operator-5446b9c989-8rffd" Dec 09 15:09:46 crc kubenswrapper[4555]: I1209 15:09:46.881073 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5tm8\" (UniqueName: \"kubernetes.io/projected/7bd718ca-da15-4b6c-89ca-01b62bab1393-kube-api-access-w5tm8\") pod \"perses-operator-5446b9c989-8rffd\" (UID: \"7bd718ca-da15-4b6c-89ca-01b62bab1393\") " pod="openshift-operators/perses-operator-5446b9c989-8rffd" Dec 09 15:09:46 crc kubenswrapper[4555]: I1209 15:09:46.882035 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/7bd718ca-da15-4b6c-89ca-01b62bab1393-openshift-service-ca\") pod \"perses-operator-5446b9c989-8rffd\" (UID: \"7bd718ca-da15-4b6c-89ca-01b62bab1393\") " pod="openshift-operators/perses-operator-5446b9c989-8rffd" Dec 09 15:09:46 crc kubenswrapper[4555]: I1209 15:09:46.893345 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5tm8\" (UniqueName: \"kubernetes.io/projected/7bd718ca-da15-4b6c-89ca-01b62bab1393-kube-api-access-w5tm8\") pod \"perses-operator-5446b9c989-8rffd\" (UID: \"7bd718ca-da15-4b6c-89ca-01b62bab1393\") " pod="openshift-operators/perses-operator-5446b9c989-8rffd" Dec 09 15:09:46 crc kubenswrapper[4555]: E1209 15:09:46.893457 4555 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-7l249_openshift-operators_2302fd31-2414-419b-9733-4d69ef2d00fe_0(8abfdae6d9989fc9c7b31c897446abf44208380cbc8354da341da23bb7503230): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 09 15:09:46 crc kubenswrapper[4555]: E1209 15:09:46.893490 4555 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-7l249_openshift-operators_2302fd31-2414-419b-9733-4d69ef2d00fe_0(8abfdae6d9989fc9c7b31c897446abf44208380cbc8354da341da23bb7503230): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-d8bb48f5d-7l249" Dec 09 15:09:46 crc kubenswrapper[4555]: E1209 15:09:46.893509 4555 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-7l249_openshift-operators_2302fd31-2414-419b-9733-4d69ef2d00fe_0(8abfdae6d9989fc9c7b31c897446abf44208380cbc8354da341da23bb7503230): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/observability-operator-d8bb48f5d-7l249" Dec 09 15:09:46 crc kubenswrapper[4555]: E1209 15:09:46.893556 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-d8bb48f5d-7l249_openshift-operators(2302fd31-2414-419b-9733-4d69ef2d00fe)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-d8bb48f5d-7l249_openshift-operators(2302fd31-2414-419b-9733-4d69ef2d00fe)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-7l249_openshift-operators_2302fd31-2414-419b-9733-4d69ef2d00fe_0(8abfdae6d9989fc9c7b31c897446abf44208380cbc8354da341da23bb7503230): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-d8bb48f5d-7l249" podUID="2302fd31-2414-419b-9733-4d69ef2d00fe" Dec 09 15:09:47 crc kubenswrapper[4555]: I1209 15:09:47.066641 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-8rffd" Dec 09 15:09:47 crc kubenswrapper[4555]: E1209 15:09:47.100377 4555 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-8rffd_openshift-operators_7bd718ca-da15-4b6c-89ca-01b62bab1393_0(bbcce9684bebeb61d56c982852c364148966a24f0f02329988873f769126aca5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 09 15:09:47 crc kubenswrapper[4555]: E1209 15:09:47.100431 4555 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-8rffd_openshift-operators_7bd718ca-da15-4b6c-89ca-01b62bab1393_0(bbcce9684bebeb61d56c982852c364148966a24f0f02329988873f769126aca5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5446b9c989-8rffd" Dec 09 15:09:47 crc kubenswrapper[4555]: E1209 15:09:47.100449 4555 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-8rffd_openshift-operators_7bd718ca-da15-4b6c-89ca-01b62bab1393_0(bbcce9684bebeb61d56c982852c364148966a24f0f02329988873f769126aca5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5446b9c989-8rffd" Dec 09 15:09:47 crc kubenswrapper[4555]: E1209 15:09:47.100484 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5446b9c989-8rffd_openshift-operators(7bd718ca-da15-4b6c-89ca-01b62bab1393)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5446b9c989-8rffd_openshift-operators(7bd718ca-da15-4b6c-89ca-01b62bab1393)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-8rffd_openshift-operators_7bd718ca-da15-4b6c-89ca-01b62bab1393_0(bbcce9684bebeb61d56c982852c364148966a24f0f02329988873f769126aca5): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/perses-operator-5446b9c989-8rffd" podUID="7bd718ca-da15-4b6c-89ca-01b62bab1393" Dec 09 15:09:47 crc kubenswrapper[4555]: I1209 15:09:47.367476 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" event={"ID":"4a47d5ee-c928-4a04-8683-8fd4fe63ea9c","Type":"ContainerStarted","Data":"1b95d726857ccb2df5cb570076fd08c92a533a3cb33c862aafd56e05a540c3c7"} Dec 09 15:09:47 crc kubenswrapper[4555]: I1209 15:09:47.367512 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" event={"ID":"4a47d5ee-c928-4a04-8683-8fd4fe63ea9c","Type":"ContainerStarted","Data":"028ff9847bd2289b21d10c0b68b1a48f2bc84456df624f270a4f06df5f36665f"} Dec 09 15:09:47 crc kubenswrapper[4555]: I1209 15:09:47.367523 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" event={"ID":"4a47d5ee-c928-4a04-8683-8fd4fe63ea9c","Type":"ContainerStarted","Data":"d22ea32b02dd47949b4042fc5a17ed387fa2a4a6b95762432db2ab51cf5cfd01"} Dec 09 15:09:47 crc kubenswrapper[4555]: I1209 15:09:47.367532 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" event={"ID":"4a47d5ee-c928-4a04-8683-8fd4fe63ea9c","Type":"ContainerStarted","Data":"2b5f8ff420b178d0573c2497d49d15f86d08ea2a89522e450b3fb0cc097958a3"} Dec 09 15:09:47 crc kubenswrapper[4555]: I1209 15:09:47.367539 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" event={"ID":"4a47d5ee-c928-4a04-8683-8fd4fe63ea9c","Type":"ContainerStarted","Data":"75acdaf0f1aa98e656a61f3327245a82ee89d219cb08c777e3f5ca2d6118dd40"} Dec 09 15:09:47 crc kubenswrapper[4555]: I1209 15:09:47.367547 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" event={"ID":"4a47d5ee-c928-4a04-8683-8fd4fe63ea9c","Type":"ContainerStarted","Data":"25b9e07702f291265e6d38ba52d3816e399895df60918e168cf502be53d87592"} Dec 09 15:09:49 crc kubenswrapper[4555]: I1209 15:09:49.387645 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" event={"ID":"4a47d5ee-c928-4a04-8683-8fd4fe63ea9c","Type":"ContainerStarted","Data":"1846670a700446ea326a1ce067fddc373a30aaa1033d96e6fade742125a3c074"} Dec 09 15:09:51 crc kubenswrapper[4555]: I1209 15:09:51.400999 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" event={"ID":"4a47d5ee-c928-4a04-8683-8fd4fe63ea9c","Type":"ContainerStarted","Data":"d13177ff4493eb993c1cd68feeac693e4a914cc30fbe73e82ee780e20c7c6a2d"} Dec 09 15:09:51 crc kubenswrapper[4555]: I1209 15:09:51.401561 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" Dec 09 15:09:51 crc kubenswrapper[4555]: I1209 15:09:51.401573 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" Dec 09 15:09:51 crc kubenswrapper[4555]: I1209 15:09:51.425836 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" podStartSLOduration=6.4258191700000005 podStartE2EDuration="6.42581917s" podCreationTimestamp="2025-12-09 15:09:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:09:51.424108737 +0000 UTC 
m=+701.560821456" watchObservedRunningTime="2025-12-09 15:09:51.42581917 +0000 UTC m=+701.562531889" Dec 09 15:09:51 crc kubenswrapper[4555]: I1209 15:09:51.429486 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" Dec 09 15:09:51 crc kubenswrapper[4555]: I1209 15:09:51.947275 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-79cb85764c-ff8sc"] Dec 09 15:09:51 crc kubenswrapper[4555]: I1209 15:09:51.947388 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-79cb85764c-ff8sc" Dec 09 15:09:51 crc kubenswrapper[4555]: I1209 15:09:51.947660 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-79cb85764c-ff8sc" Dec 09 15:09:51 crc kubenswrapper[4555]: E1209 15:09:51.965825 4555 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-79cb85764c-ff8sc_openshift-operators_b486a37e-2de3-4639-b189-e69c501f9a6f_0(358ec2f6df0ac09cb4d46a136d177448bbf428f4820256b5cc702c0c88a61bf4): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 09 15:09:51 crc kubenswrapper[4555]: E1209 15:09:51.965880 4555 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-79cb85764c-ff8sc_openshift-operators_b486a37e-2de3-4639-b189-e69c501f9a6f_0(358ec2f6df0ac09cb4d46a136d177448bbf428f4820256b5cc702c0c88a61bf4): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-79cb85764c-ff8sc" Dec 09 15:09:51 crc kubenswrapper[4555]: E1209 15:09:51.965902 4555 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-79cb85764c-ff8sc_openshift-operators_b486a37e-2de3-4639-b189-e69c501f9a6f_0(358ec2f6df0ac09cb4d46a136d177448bbf428f4820256b5cc702c0c88a61bf4): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-79cb85764c-ff8sc" Dec 09 15:09:51 crc kubenswrapper[4555]: E1209 15:09:51.965944 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-79cb85764c-ff8sc_openshift-operators(b486a37e-2de3-4639-b189-e69c501f9a6f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-79cb85764c-ff8sc_openshift-operators(b486a37e-2de3-4639-b189-e69c501f9a6f)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-79cb85764c-ff8sc_openshift-operators_b486a37e-2de3-4639-b189-e69c501f9a6f_0(358ec2f6df0ac09cb4d46a136d177448bbf428f4820256b5cc702c0c88a61bf4): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-79cb85764c-ff8sc" podUID="b486a37e-2de3-4639-b189-e69c501f9a6f" Dec 09 15:09:51 crc kubenswrapper[4555]: I1209 15:09:51.980957 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-v4h96"] Dec 09 15:09:51 crc kubenswrapper[4555]: I1209 15:09:51.981047 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-v4h96" Dec 09 15:09:51 crc kubenswrapper[4555]: I1209 15:09:51.981491 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-v4h96" Dec 09 15:09:51 crc kubenswrapper[4555]: I1209 15:09:51.993654 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-8rffd"] Dec 09 15:09:51 crc kubenswrapper[4555]: I1209 15:09:51.993731 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-8rffd" Dec 09 15:09:51 crc kubenswrapper[4555]: I1209 15:09:51.994029 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-8rffd" Dec 09 15:09:52 crc kubenswrapper[4555]: I1209 15:09:51.997141 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-7l249"] Dec 09 15:09:52 crc kubenswrapper[4555]: I1209 15:09:51.997241 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-7l249" Dec 09 15:09:52 crc kubenswrapper[4555]: I1209 15:09:51.997661 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-7l249" Dec 09 15:09:52 crc kubenswrapper[4555]: I1209 15:09:52.000249 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-79cb85764c-4sw2g"] Dec 09 15:09:52 crc kubenswrapper[4555]: I1209 15:09:52.000348 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-79cb85764c-4sw2g" Dec 09 15:09:52 crc kubenswrapper[4555]: I1209 15:09:52.000679 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-79cb85764c-4sw2g" Dec 09 15:09:52 crc kubenswrapper[4555]: E1209 15:09:52.012830 4555 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-v4h96_openshift-operators_ce4b9ae1-9c56-467a-908e-3b990e17f1cd_0(42e7655bf74c8922b2b8f8fded2e3334d4d0055dbec11ddd390ed7ea727b4b28): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 09 15:09:52 crc kubenswrapper[4555]: E1209 15:09:52.012866 4555 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-v4h96_openshift-operators_ce4b9ae1-9c56-467a-908e-3b990e17f1cd_0(42e7655bf74c8922b2b8f8fded2e3334d4d0055dbec11ddd390ed7ea727b4b28): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-v4h96" Dec 09 15:09:52 crc kubenswrapper[4555]: E1209 15:09:52.012885 4555 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-v4h96_openshift-operators_ce4b9ae1-9c56-467a-908e-3b990e17f1cd_0(42e7655bf74c8922b2b8f8fded2e3334d4d0055dbec11ddd390ed7ea727b4b28): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-v4h96" Dec 09 15:09:52 crc kubenswrapper[4555]: E1209 15:09:52.012914 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-668cf9dfbb-v4h96_openshift-operators(ce4b9ae1-9c56-467a-908e-3b990e17f1cd)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-668cf9dfbb-v4h96_openshift-operators(ce4b9ae1-9c56-467a-908e-3b990e17f1cd)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-v4h96_openshift-operators_ce4b9ae1-9c56-467a-908e-3b990e17f1cd_0(42e7655bf74c8922b2b8f8fded2e3334d4d0055dbec11ddd390ed7ea727b4b28): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-v4h96" podUID="ce4b9ae1-9c56-467a-908e-3b990e17f1cd" Dec 09 15:09:52 crc kubenswrapper[4555]: E1209 15:09:52.037474 4555 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-8rffd_openshift-operators_7bd718ca-da15-4b6c-89ca-01b62bab1393_0(4b0079bc6098fb69dda83711f8dc19cc2faa9e615c4f315f5e396b0600759490): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 09 15:09:52 crc kubenswrapper[4555]: E1209 15:09:52.037529 4555 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-8rffd_openshift-operators_7bd718ca-da15-4b6c-89ca-01b62bab1393_0(4b0079bc6098fb69dda83711f8dc19cc2faa9e615c4f315f5e396b0600759490): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5446b9c989-8rffd" Dec 09 15:09:52 crc kubenswrapper[4555]: E1209 15:09:52.037552 4555 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-8rffd_openshift-operators_7bd718ca-da15-4b6c-89ca-01b62bab1393_0(4b0079bc6098fb69dda83711f8dc19cc2faa9e615c4f315f5e396b0600759490): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/perses-operator-5446b9c989-8rffd" Dec 09 15:09:52 crc kubenswrapper[4555]: E1209 15:09:52.037596 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5446b9c989-8rffd_openshift-operators(7bd718ca-da15-4b6c-89ca-01b62bab1393)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5446b9c989-8rffd_openshift-operators(7bd718ca-da15-4b6c-89ca-01b62bab1393)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-8rffd_openshift-operators_7bd718ca-da15-4b6c-89ca-01b62bab1393_0(4b0079bc6098fb69dda83711f8dc19cc2faa9e615c4f315f5e396b0600759490): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-5446b9c989-8rffd" podUID="7bd718ca-da15-4b6c-89ca-01b62bab1393" Dec 09 15:09:52 crc kubenswrapper[4555]: E1209 15:09:52.043247 4555 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-79cb85764c-4sw2g_openshift-operators_42ba541e-2ff1-45bb-85d9-9f4ee87e0bb9_0(3e407d5cb30b46998d61525d1e0bf9c65e80f6386efbc6eb4c7417a41c310020): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 09 15:09:52 crc kubenswrapper[4555]: E1209 15:09:52.043289 4555 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-79cb85764c-4sw2g_openshift-operators_42ba541e-2ff1-45bb-85d9-9f4ee87e0bb9_0(3e407d5cb30b46998d61525d1e0bf9c65e80f6386efbc6eb4c7417a41c310020): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-79cb85764c-4sw2g" Dec 09 15:09:52 crc kubenswrapper[4555]: E1209 15:09:52.043320 4555 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-79cb85764c-4sw2g_openshift-operators_42ba541e-2ff1-45bb-85d9-9f4ee87e0bb9_0(3e407d5cb30b46998d61525d1e0bf9c65e80f6386efbc6eb4c7417a41c310020): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-79cb85764c-4sw2g" Dec 09 15:09:52 crc kubenswrapper[4555]: E1209 15:09:52.043352 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-79cb85764c-4sw2g_openshift-operators(42ba541e-2ff1-45bb-85d9-9f4ee87e0bb9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-79cb85764c-4sw2g_openshift-operators(42ba541e-2ff1-45bb-85d9-9f4ee87e0bb9)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-79cb85764c-4sw2g_openshift-operators_42ba541e-2ff1-45bb-85d9-9f4ee87e0bb9_0(3e407d5cb30b46998d61525d1e0bf9c65e80f6386efbc6eb4c7417a41c310020): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-79cb85764c-4sw2g" podUID="42ba541e-2ff1-45bb-85d9-9f4ee87e0bb9" Dec 09 15:09:52 crc kubenswrapper[4555]: E1209 15:09:52.052217 4555 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-7l249_openshift-operators_2302fd31-2414-419b-9733-4d69ef2d00fe_0(617091f809506a41ad4cc717fd94b900f790c1ce1b2b20e54157e396391b09c0): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 09 15:09:52 crc kubenswrapper[4555]: E1209 15:09:52.052265 4555 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-7l249_openshift-operators_2302fd31-2414-419b-9733-4d69ef2d00fe_0(617091f809506a41ad4cc717fd94b900f790c1ce1b2b20e54157e396391b09c0): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-d8bb48f5d-7l249" Dec 09 15:09:52 crc kubenswrapper[4555]: E1209 15:09:52.052283 4555 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-7l249_openshift-operators_2302fd31-2414-419b-9733-4d69ef2d00fe_0(617091f809506a41ad4cc717fd94b900f790c1ce1b2b20e54157e396391b09c0): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-d8bb48f5d-7l249" Dec 09 15:09:52 crc kubenswrapper[4555]: E1209 15:09:52.052338 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-d8bb48f5d-7l249_openshift-operators(2302fd31-2414-419b-9733-4d69ef2d00fe)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-d8bb48f5d-7l249_openshift-operators(2302fd31-2414-419b-9733-4d69ef2d00fe)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-7l249_openshift-operators_2302fd31-2414-419b-9733-4d69ef2d00fe_0(617091f809506a41ad4cc717fd94b900f790c1ce1b2b20e54157e396391b09c0): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/observability-operator-d8bb48f5d-7l249" podUID="2302fd31-2414-419b-9733-4d69ef2d00fe" Dec 09 15:09:52 crc kubenswrapper[4555]: I1209 15:09:52.405774 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" Dec 09 15:09:52 crc kubenswrapper[4555]: I1209 15:09:52.433437 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" Dec 09 15:09:55 crc kubenswrapper[4555]: I1209 15:09:55.531029 4555 patch_prober.go:28] interesting pod/machine-config-daemon-x6p6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 15:09:55 crc kubenswrapper[4555]: I1209 15:09:55.531277 4555 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 15:09:57 crc kubenswrapper[4555]: I1209 15:09:57.161036 4555 scope.go:117] "RemoveContainer" containerID="fddfb072bd733f5c700d405bf80dcbbe045961e870311c7880dc5d639fdb0059" Dec 09 15:09:57 crc kubenswrapper[4555]: E1209 15:09:57.161247 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-gjjcm_openshift-multus(3e34542c-29e8-43d3-add3-42aa080e0c8a)\"" pod="openshift-multus/multus-gjjcm" podUID="3e34542c-29e8-43d3-add3-42aa080e0c8a" Dec 09 15:10:03 crc kubenswrapper[4555]: I1209 15:10:03.160844 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-8rffd" Dec 09 15:10:03 crc kubenswrapper[4555]: I1209 15:10:03.161754 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-8rffd" Dec 09 15:10:03 crc kubenswrapper[4555]: E1209 15:10:03.183969 4555 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-8rffd_openshift-operators_7bd718ca-da15-4b6c-89ca-01b62bab1393_0(40dc32ef86656f305f20ee3cf47c2b2330f937d82b134d4040d0f2cbc54b1aa2): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 09 15:10:03 crc kubenswrapper[4555]: E1209 15:10:03.184026 4555 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-8rffd_openshift-operators_7bd718ca-da15-4b6c-89ca-01b62bab1393_0(40dc32ef86656f305f20ee3cf47c2b2330f937d82b134d4040d0f2cbc54b1aa2): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/perses-operator-5446b9c989-8rffd" Dec 09 15:10:03 crc kubenswrapper[4555]: E1209 15:10:03.184046 4555 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-8rffd_openshift-operators_7bd718ca-da15-4b6c-89ca-01b62bab1393_0(40dc32ef86656f305f20ee3cf47c2b2330f937d82b134d4040d0f2cbc54b1aa2): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5446b9c989-8rffd" Dec 09 15:10:03 crc kubenswrapper[4555]: E1209 15:10:03.184089 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5446b9c989-8rffd_openshift-operators(7bd718ca-da15-4b6c-89ca-01b62bab1393)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5446b9c989-8rffd_openshift-operators(7bd718ca-da15-4b6c-89ca-01b62bab1393)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-8rffd_openshift-operators_7bd718ca-da15-4b6c-89ca-01b62bab1393_0(40dc32ef86656f305f20ee3cf47c2b2330f937d82b134d4040d0f2cbc54b1aa2): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-5446b9c989-8rffd" podUID="7bd718ca-da15-4b6c-89ca-01b62bab1393" Dec 09 15:10:05 crc kubenswrapper[4555]: I1209 15:10:05.160889 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-v4h96" Dec 09 15:10:05 crc kubenswrapper[4555]: I1209 15:10:05.161557 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-v4h96" Dec 09 15:10:05 crc kubenswrapper[4555]: E1209 15:10:05.181562 4555 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-v4h96_openshift-operators_ce4b9ae1-9c56-467a-908e-3b990e17f1cd_0(987106fa4898da8add66ec00a313966b73b48159edb977844b50476787879734): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 09 15:10:05 crc kubenswrapper[4555]: E1209 15:10:05.181611 4555 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-v4h96_openshift-operators_ce4b9ae1-9c56-467a-908e-3b990e17f1cd_0(987106fa4898da8add66ec00a313966b73b48159edb977844b50476787879734): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-v4h96" Dec 09 15:10:05 crc kubenswrapper[4555]: E1209 15:10:05.181632 4555 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-v4h96_openshift-operators_ce4b9ae1-9c56-467a-908e-3b990e17f1cd_0(987106fa4898da8add66ec00a313966b73b48159edb977844b50476787879734): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-v4h96" Dec 09 15:10:05 crc kubenswrapper[4555]: E1209 15:10:05.181664 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-668cf9dfbb-v4h96_openshift-operators(ce4b9ae1-9c56-467a-908e-3b990e17f1cd)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-668cf9dfbb-v4h96_openshift-operators(ce4b9ae1-9c56-467a-908e-3b990e17f1cd)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-v4h96_openshift-operators_ce4b9ae1-9c56-467a-908e-3b990e17f1cd_0(987106fa4898da8add66ec00a313966b73b48159edb977844b50476787879734): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-v4h96" podUID="ce4b9ae1-9c56-467a-908e-3b990e17f1cd" Dec 09 15:10:06 crc kubenswrapper[4555]: I1209 15:10:06.160405 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-79cb85764c-4sw2g" Dec 09 15:10:06 crc kubenswrapper[4555]: I1209 15:10:06.160409 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-79cb85764c-ff8sc" Dec 09 15:10:06 crc kubenswrapper[4555]: I1209 15:10:06.160446 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-7l249" Dec 09 15:10:06 crc kubenswrapper[4555]: I1209 15:10:06.161155 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-79cb85764c-ff8sc" Dec 09 15:10:06 crc kubenswrapper[4555]: I1209 15:10:06.161160 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-79cb85764c-4sw2g" Dec 09 15:10:06 crc kubenswrapper[4555]: I1209 15:10:06.161163 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-7l249" Dec 09 15:10:06 crc kubenswrapper[4555]: E1209 15:10:06.207251 4555 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-79cb85764c-4sw2g_openshift-operators_42ba541e-2ff1-45bb-85d9-9f4ee87e0bb9_0(6c707848b18221408eb08fc234750b9a798cf973e8fc3ca61df7a78feca8c68e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 09 15:10:06 crc kubenswrapper[4555]: E1209 15:10:06.207315 4555 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-79cb85764c-4sw2g_openshift-operators_42ba541e-2ff1-45bb-85d9-9f4ee87e0bb9_0(6c707848b18221408eb08fc234750b9a798cf973e8fc3ca61df7a78feca8c68e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-79cb85764c-4sw2g" Dec 09 15:10:06 crc kubenswrapper[4555]: E1209 15:10:06.207338 4555 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-79cb85764c-4sw2g_openshift-operators_42ba541e-2ff1-45bb-85d9-9f4ee87e0bb9_0(6c707848b18221408eb08fc234750b9a798cf973e8fc3ca61df7a78feca8c68e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-79cb85764c-4sw2g" Dec 09 15:10:06 crc kubenswrapper[4555]: E1209 15:10:06.207393 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-79cb85764c-4sw2g_openshift-operators(42ba541e-2ff1-45bb-85d9-9f4ee87e0bb9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-79cb85764c-4sw2g_openshift-operators(42ba541e-2ff1-45bb-85d9-9f4ee87e0bb9)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-79cb85764c-4sw2g_openshift-operators_42ba541e-2ff1-45bb-85d9-9f4ee87e0bb9_0(6c707848b18221408eb08fc234750b9a798cf973e8fc3ca61df7a78feca8c68e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-79cb85764c-4sw2g" podUID="42ba541e-2ff1-45bb-85d9-9f4ee87e0bb9" Dec 09 15:10:06 crc kubenswrapper[4555]: E1209 15:10:06.215328 4555 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-7l249_openshift-operators_2302fd31-2414-419b-9733-4d69ef2d00fe_0(624da893da6948e6c6f4f0c7f65d2af57d64b0a6053a5eb3d61840e808057506): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 09 15:10:06 crc kubenswrapper[4555]: E1209 15:10:06.215374 4555 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-7l249_openshift-operators_2302fd31-2414-419b-9733-4d69ef2d00fe_0(624da893da6948e6c6f4f0c7f65d2af57d64b0a6053a5eb3d61840e808057506): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-d8bb48f5d-7l249" Dec 09 15:10:06 crc kubenswrapper[4555]: E1209 15:10:06.215394 4555 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-7l249_openshift-operators_2302fd31-2414-419b-9733-4d69ef2d00fe_0(624da893da6948e6c6f4f0c7f65d2af57d64b0a6053a5eb3d61840e808057506): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/observability-operator-d8bb48f5d-7l249" Dec 09 15:10:06 crc kubenswrapper[4555]: E1209 15:10:06.215439 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-d8bb48f5d-7l249_openshift-operators(2302fd31-2414-419b-9733-4d69ef2d00fe)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-d8bb48f5d-7l249_openshift-operators(2302fd31-2414-419b-9733-4d69ef2d00fe)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-7l249_openshift-operators_2302fd31-2414-419b-9733-4d69ef2d00fe_0(624da893da6948e6c6f4f0c7f65d2af57d64b0a6053a5eb3d61840e808057506): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-d8bb48f5d-7l249" podUID="2302fd31-2414-419b-9733-4d69ef2d00fe" Dec 09 15:10:06 crc kubenswrapper[4555]: E1209 15:10:06.222419 4555 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-79cb85764c-ff8sc_openshift-operators_b486a37e-2de3-4639-b189-e69c501f9a6f_0(60a1579959a8a0c4c07b649122a3581ec0499e3561402209654f90eae153f03f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 09 15:10:06 crc kubenswrapper[4555]: E1209 15:10:06.222496 4555 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-79cb85764c-ff8sc_openshift-operators_b486a37e-2de3-4639-b189-e69c501f9a6f_0(60a1579959a8a0c4c07b649122a3581ec0499e3561402209654f90eae153f03f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-79cb85764c-ff8sc" Dec 09 15:10:06 crc kubenswrapper[4555]: E1209 15:10:06.222518 4555 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-79cb85764c-ff8sc_openshift-operators_b486a37e-2de3-4639-b189-e69c501f9a6f_0(60a1579959a8a0c4c07b649122a3581ec0499e3561402209654f90eae153f03f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-79cb85764c-ff8sc" Dec 09 15:10:06 crc kubenswrapper[4555]: E1209 15:10:06.222560 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-79cb85764c-ff8sc_openshift-operators(b486a37e-2de3-4639-b189-e69c501f9a6f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-79cb85764c-ff8sc_openshift-operators(b486a37e-2de3-4639-b189-e69c501f9a6f)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-79cb85764c-ff8sc_openshift-operators_b486a37e-2de3-4639-b189-e69c501f9a6f_0(60a1579959a8a0c4c07b649122a3581ec0499e3561402209654f90eae153f03f): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-79cb85764c-ff8sc" podUID="b486a37e-2de3-4639-b189-e69c501f9a6f" Dec 09 15:10:08 crc kubenswrapper[4555]: I1209 15:10:08.160797 4555 scope.go:117] "RemoveContainer" containerID="fddfb072bd733f5c700d405bf80dcbbe045961e870311c7880dc5d639fdb0059" Dec 09 15:10:08 crc kubenswrapper[4555]: I1209 15:10:08.502329 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gjjcm_3e34542c-29e8-43d3-add3-42aa080e0c8a/kube-multus/2.log" Dec 09 15:10:08 crc kubenswrapper[4555]: I1209 15:10:08.502531 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gjjcm" event={"ID":"3e34542c-29e8-43d3-add3-42aa080e0c8a","Type":"ContainerStarted","Data":"eab9e4ed610af9edf4b51f9766f1b9c608321f615d779455b0fff854ea65f48a"} Dec 09 15:10:14 crc kubenswrapper[4555]: I1209 15:10:14.160805 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-8rffd" Dec 09 15:10:14 crc kubenswrapper[4555]: I1209 15:10:14.161335 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-8rffd" Dec 09 15:10:14 crc kubenswrapper[4555]: I1209 15:10:14.489860 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-8rffd"] Dec 09 15:10:14 crc kubenswrapper[4555]: W1209 15:10:14.493599 4555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7bd718ca_da15_4b6c_89ca_01b62bab1393.slice/crio-a5ec00631359521cfefff6254b859370e856bb8e6a886943874e44de7d4f5662 WatchSource:0}: Error finding container a5ec00631359521cfefff6254b859370e856bb8e6a886943874e44de7d4f5662: Status 404 returned error can't find the container with id a5ec00631359521cfefff6254b859370e856bb8e6a886943874e44de7d4f5662 Dec 09 15:10:14 crc kubenswrapper[4555]: I1209 15:10:14.527786 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-8rffd" event={"ID":"7bd718ca-da15-4b6c-89ca-01b62bab1393","Type":"ContainerStarted","Data":"a5ec00631359521cfefff6254b859370e856bb8e6a886943874e44de7d4f5662"} Dec 09 15:10:15 crc kubenswrapper[4555]: I1209 15:10:15.491540 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4zlc7" Dec 09 15:10:17 crc kubenswrapper[4555]: I1209 15:10:17.160781 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-79cb85764c-ff8sc" Dec 09 15:10:17 crc kubenswrapper[4555]: I1209 15:10:17.161088 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-79cb85764c-ff8sc" Dec 09 15:10:17 crc kubenswrapper[4555]: I1209 15:10:17.504347 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-79cb85764c-ff8sc"] Dec 09 15:10:18 crc kubenswrapper[4555]: I1209 15:10:18.160637 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-v4h96" Dec 09 15:10:18 crc kubenswrapper[4555]: I1209 15:10:18.161123 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-v4h96" Dec 09 15:10:20 crc kubenswrapper[4555]: I1209 15:10:20.160773 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-79cb85764c-4sw2g" Dec 09 15:10:20 crc kubenswrapper[4555]: I1209 15:10:20.164857 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-7l249" Dec 09 15:10:20 crc kubenswrapper[4555]: I1209 15:10:20.165202 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-79cb85764c-4sw2g" Dec 09 15:10:20 crc kubenswrapper[4555]: I1209 15:10:20.165524 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-7l249" Dec 09 15:10:20 crc kubenswrapper[4555]: I1209 15:10:20.559269 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-8rffd" event={"ID":"7bd718ca-da15-4b6c-89ca-01b62bab1393","Type":"ContainerStarted","Data":"cf9689714d383a2a976c9a384fe151f7721c0658b5c08c0d7e91e07de9e6aa2f"} Dec 09 15:10:20 crc kubenswrapper[4555]: I1209 15:10:20.559606 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5446b9c989-8rffd" Dec 09 15:10:20 crc kubenswrapper[4555]: I1209 15:10:20.560721 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-79cb85764c-ff8sc" event={"ID":"b486a37e-2de3-4639-b189-e69c501f9a6f","Type":"ContainerStarted","Data":"6b8a863b2f464563588b2a2fc68028992251c2c75b34517e361fe47c7d806b1e"} Dec 09 15:10:20 crc kubenswrapper[4555]: I1209 15:10:20.573444 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5446b9c989-8rffd" podStartSLOduration=28.81680163 podStartE2EDuration="34.573426186s" podCreationTimestamp="2025-12-09 15:09:46 +0000 UTC" firstStartedPulling="2025-12-09 15:10:14.495708872 +0000 UTC m=+724.632421590" lastFinishedPulling="2025-12-09 15:10:20.252333428 +0000 UTC m=+730.389046146" observedRunningTime="2025-12-09 15:10:20.573052752 +0000 UTC m=+730.709765470" watchObservedRunningTime="2025-12-09 15:10:20.573426186 +0000 UTC m=+730.710138904" Dec 09 15:10:20 crc kubenswrapper[4555]: I1209 15:10:20.587473 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-79cb85764c-4sw2g"] Dec 09 15:10:20 crc kubenswrapper[4555]: W1209 15:10:20.594900 4555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42ba541e_2ff1_45bb_85d9_9f4ee87e0bb9.slice/crio-898f2f98a6d344afb8cd8c2a40e66e88e0efca0ae7bc67ea82724128a73be1cc WatchSource:0}: Error finding container 898f2f98a6d344afb8cd8c2a40e66e88e0efca0ae7bc67ea82724128a73be1cc: Status 404 returned error can't find the container with id 898f2f98a6d344afb8cd8c2a40e66e88e0efca0ae7bc67ea82724128a73be1cc Dec 09 15:10:20 crc kubenswrapper[4555]: I1209 15:10:20.623351 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-v4h96"] Dec 09 15:10:20 crc kubenswrapper[4555]: W1209 15:10:20.630006 4555 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce4b9ae1_9c56_467a_908e_3b990e17f1cd.slice/crio-3831c025a7bf7796263499cc7d8fe773d460ff75b7feb70b0d396898767960a1 WatchSource:0}: Error finding container 3831c025a7bf7796263499cc7d8fe773d460ff75b7feb70b0d396898767960a1: Status 404 returned error can't find the container with id 3831c025a7bf7796263499cc7d8fe773d460ff75b7feb70b0d396898767960a1 Dec 09 15:10:20 crc kubenswrapper[4555]: I1209 15:10:20.642920 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-7l249"] Dec 09 15:10:20 crc kubenswrapper[4555]: W1209 15:10:20.648848 4555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2302fd31_2414_419b_9733_4d69ef2d00fe.slice/crio-e421a6de59a5a36f5f08e69ff9bbae3ecb032c739838bdfcf99e6f7bf5995ab7 WatchSource:0}: Error finding container e421a6de59a5a36f5f08e69ff9bbae3ecb032c739838bdfcf99e6f7bf5995ab7: Status 404 returned error can't find the container with id e421a6de59a5a36f5f08e69ff9bbae3ecb032c739838bdfcf99e6f7bf5995ab7 Dec 09 15:10:21 crc kubenswrapper[4555]: I1209 15:10:21.566963 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-7l249" event={"ID":"2302fd31-2414-419b-9733-4d69ef2d00fe","Type":"ContainerStarted","Data":"e421a6de59a5a36f5f08e69ff9bbae3ecb032c739838bdfcf99e6f7bf5995ab7"} Dec 09 15:10:21 crc kubenswrapper[4555]: I1209 15:10:21.569569 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-79cb85764c-4sw2g" event={"ID":"42ba541e-2ff1-45bb-85d9-9f4ee87e0bb9","Type":"ContainerStarted","Data":"898f2f98a6d344afb8cd8c2a40e66e88e0efca0ae7bc67ea82724128a73be1cc"} Dec 09 15:10:21 crc kubenswrapper[4555]: I1209 15:10:21.571218 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-v4h96" event={"ID":"ce4b9ae1-9c56-467a-908e-3b990e17f1cd","Type":"ContainerStarted","Data":"3831c025a7bf7796263499cc7d8fe773d460ff75b7feb70b0d396898767960a1"} Dec 09 15:10:22 crc kubenswrapper[4555]: I1209 15:10:22.587436 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-79cb85764c-ff8sc" event={"ID":"b486a37e-2de3-4639-b189-e69c501f9a6f","Type":"ContainerStarted","Data":"fa87301082ad51a5362c24de87a043a674ad609d852e1f226c5ae4330b7c4844"} Dec 09 15:10:22 crc kubenswrapper[4555]: I1209 15:10:22.589860 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-79cb85764c-4sw2g" event={"ID":"42ba541e-2ff1-45bb-85d9-9f4ee87e0bb9","Type":"ContainerStarted","Data":"3ac80609b85d7b27389b9153294de7dc079edd5f9acfcfd90a5a98fe9daac307"} Dec 09 15:10:22 crc kubenswrapper[4555]: I1209 15:10:22.606968 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-79cb85764c-ff8sc" podStartSLOduration=34.985223796 podStartE2EDuration="36.606946911s" podCreationTimestamp="2025-12-09 15:09:46 +0000 UTC" firstStartedPulling="2025-12-09 15:10:20.195702788 +0000 UTC m=+730.332415507" lastFinishedPulling="2025-12-09 15:10:21.817425904 +0000 UTC m=+731.954138622" observedRunningTime="2025-12-09 15:10:22.604385152 +0000 UTC m=+732.741097870" watchObservedRunningTime="2025-12-09 15:10:22.606946911 +0000 UTC m=+732.743659630" Dec 09 15:10:22 crc 
kubenswrapper[4555]: I1209 15:10:22.624983 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-79cb85764c-4sw2g" podStartSLOduration=35.400023667 podStartE2EDuration="36.624965264s" podCreationTimestamp="2025-12-09 15:09:46 +0000 UTC" firstStartedPulling="2025-12-09 15:10:20.597073249 +0000 UTC m=+730.733785967" lastFinishedPulling="2025-12-09 15:10:21.822014846 +0000 UTC m=+731.958727564" observedRunningTime="2025-12-09 15:10:22.617849436 +0000 UTC m=+732.754562154" watchObservedRunningTime="2025-12-09 15:10:22.624965264 +0000 UTC m=+732.761677971" Dec 09 15:10:23 crc kubenswrapper[4555]: I1209 15:10:23.595788 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-v4h96" event={"ID":"ce4b9ae1-9c56-467a-908e-3b990e17f1cd","Type":"ContainerStarted","Data":"abdd4e7ecc82f15e7a5ec574e4d0be916c91c5fb740b99c69d385a7c50c91158"} Dec 09 15:10:23 crc kubenswrapper[4555]: I1209 15:10:23.610984 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-v4h96" podStartSLOduration=35.001526543 podStartE2EDuration="37.610970406s" podCreationTimestamp="2025-12-09 15:09:46 +0000 UTC" firstStartedPulling="2025-12-09 15:10:20.632597025 +0000 UTC m=+730.769309743" lastFinishedPulling="2025-12-09 15:10:23.242040888 +0000 UTC m=+733.378753606" observedRunningTime="2025-12-09 15:10:23.608563318 +0000 UTC m=+733.745276036" watchObservedRunningTime="2025-12-09 15:10:23.610970406 +0000 UTC m=+733.747683123" Dec 09 15:10:25 crc kubenswrapper[4555]: I1209 15:10:25.531126 4555 patch_prober.go:28] interesting pod/machine-config-daemon-x6p6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 15:10:25 crc kubenswrapper[4555]: I1209 15:10:25.531428 4555 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 15:10:25 crc kubenswrapper[4555]: I1209 15:10:25.607558 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-7l249" event={"ID":"2302fd31-2414-419b-9733-4d69ef2d00fe","Type":"ContainerStarted","Data":"9dfc4f955fdd9533f7fa59661e3b8c70ba37ab43a940cf22f19d22492dff6052"} Dec 09 15:10:25 crc kubenswrapper[4555]: I1209 15:10:25.607783 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-d8bb48f5d-7l249" Dec 09 15:10:25 crc kubenswrapper[4555]: I1209 15:10:25.609978 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-d8bb48f5d-7l249" Dec 09 15:10:25 crc kubenswrapper[4555]: I1209 15:10:25.622185 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-d8bb48f5d-7l249" podStartSLOduration=34.989673146 podStartE2EDuration="39.622169978s" podCreationTimestamp="2025-12-09 15:09:46 +0000 UTC" firstStartedPulling="2025-12-09 15:10:20.650816605 +0000 UTC m=+730.787529324" lastFinishedPulling="2025-12-09 
15:10:25.283313438 +0000 UTC m=+735.420026156" observedRunningTime="2025-12-09 15:10:25.619982503 +0000 UTC m=+735.756695221" watchObservedRunningTime="2025-12-09 15:10:25.622169978 +0000 UTC m=+735.758882685" Dec 09 15:10:27 crc kubenswrapper[4555]: I1209 15:10:27.069043 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5446b9c989-8rffd" Dec 09 15:10:32 crc kubenswrapper[4555]: I1209 15:10:32.696516 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-gkkj9"] Dec 09 15:10:32 crc kubenswrapper[4555]: I1209 15:10:32.697498 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-gkkj9" Dec 09 15:10:32 crc kubenswrapper[4555]: I1209 15:10:32.699578 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Dec 09 15:10:32 crc kubenswrapper[4555]: I1209 15:10:32.699731 4555 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-88m6m" Dec 09 15:10:32 crc kubenswrapper[4555]: I1209 15:10:32.702211 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Dec 09 15:10:32 crc kubenswrapper[4555]: I1209 15:10:32.707387 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-gkkj9"] Dec 09 15:10:32 crc kubenswrapper[4555]: I1209 15:10:32.710518 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-lp6rb"] Dec 09 15:10:32 crc kubenswrapper[4555]: I1209 15:10:32.711087 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-lp6rb" Dec 09 15:10:32 crc kubenswrapper[4555]: I1209 15:10:32.712989 4555 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-ndh4w" Dec 09 15:10:32 crc kubenswrapper[4555]: I1209 15:10:32.716465 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-l4ts5"] Dec 09 15:10:32 crc kubenswrapper[4555]: I1209 15:10:32.717008 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-l4ts5" Dec 09 15:10:32 crc kubenswrapper[4555]: I1209 15:10:32.723698 4555 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-dpf4m" Dec 09 15:10:32 crc kubenswrapper[4555]: I1209 15:10:32.724120 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-lp6rb"] Dec 09 15:10:32 crc kubenswrapper[4555]: I1209 15:10:32.734221 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-l4ts5"] Dec 09 15:10:32 crc kubenswrapper[4555]: I1209 15:10:32.832745 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxqqj\" (UniqueName: \"kubernetes.io/projected/ed698ce4-ff03-4c60-9f16-e9267f8450b1-kube-api-access-zxqqj\") pod \"cert-manager-cainjector-7f985d654d-gkkj9\" (UID: \"ed698ce4-ff03-4c60-9f16-e9267f8450b1\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-gkkj9" Dec 09 15:10:32 crc kubenswrapper[4555]: I1209 15:10:32.832914 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mzk8\" (UniqueName: \"kubernetes.io/projected/4c6fc7d3-7d4a-45cb-bcfd-568d45abba41-kube-api-access-8mzk8\") pod \"cert-manager-webhook-5655c58dd6-l4ts5\" (UID: \"4c6fc7d3-7d4a-45cb-bcfd-568d45abba41\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-l4ts5" Dec 09 15:10:32 crc kubenswrapper[4555]: I1209 15:10:32.832975 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-686fh\" (UniqueName: \"kubernetes.io/projected/c6c72476-b1dc-4dd0-a0af-1dbda507a483-kube-api-access-686fh\") pod \"cert-manager-5b446d88c5-lp6rb\" (UID: \"c6c72476-b1dc-4dd0-a0af-1dbda507a483\") " pod="cert-manager/cert-manager-5b446d88c5-lp6rb" Dec 09 15:10:32 crc kubenswrapper[4555]: I1209 15:10:32.934455 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxqqj\" (UniqueName: \"kubernetes.io/projected/ed698ce4-ff03-4c60-9f16-e9267f8450b1-kube-api-access-zxqqj\") pod \"cert-manager-cainjector-7f985d654d-gkkj9\" (UID: \"ed698ce4-ff03-4c60-9f16-e9267f8450b1\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-gkkj9" Dec 09 15:10:32 crc kubenswrapper[4555]: I1209 15:10:32.934551 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mzk8\" (UniqueName: \"kubernetes.io/projected/4c6fc7d3-7d4a-45cb-bcfd-568d45abba41-kube-api-access-8mzk8\") pod \"cert-manager-webhook-5655c58dd6-l4ts5\" (UID: \"4c6fc7d3-7d4a-45cb-bcfd-568d45abba41\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-l4ts5" Dec 09 15:10:32 crc kubenswrapper[4555]: I1209 15:10:32.934577 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-686fh\" (UniqueName: \"kubernetes.io/projected/c6c72476-b1dc-4dd0-a0af-1dbda507a483-kube-api-access-686fh\") pod \"cert-manager-5b446d88c5-lp6rb\" (UID: \"c6c72476-b1dc-4dd0-a0af-1dbda507a483\") " pod="cert-manager/cert-manager-5b446d88c5-lp6rb" Dec 09 15:10:32 crc kubenswrapper[4555]: I1209 15:10:32.951773 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mzk8\" (UniqueName: \"kubernetes.io/projected/4c6fc7d3-7d4a-45cb-bcfd-568d45abba41-kube-api-access-8mzk8\") pod \"cert-manager-webhook-5655c58dd6-l4ts5\" (UID: \"4c6fc7d3-7d4a-45cb-bcfd-568d45abba41\") " 
pod="cert-manager/cert-manager-webhook-5655c58dd6-l4ts5" Dec 09 15:10:32 crc kubenswrapper[4555]: I1209 15:10:32.951958 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxqqj\" (UniqueName: \"kubernetes.io/projected/ed698ce4-ff03-4c60-9f16-e9267f8450b1-kube-api-access-zxqqj\") pod \"cert-manager-cainjector-7f985d654d-gkkj9\" (UID: \"ed698ce4-ff03-4c60-9f16-e9267f8450b1\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-gkkj9" Dec 09 15:10:32 crc kubenswrapper[4555]: I1209 15:10:32.952201 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-686fh\" (UniqueName: \"kubernetes.io/projected/c6c72476-b1dc-4dd0-a0af-1dbda507a483-kube-api-access-686fh\") pod \"cert-manager-5b446d88c5-lp6rb\" (UID: \"c6c72476-b1dc-4dd0-a0af-1dbda507a483\") " pod="cert-manager/cert-manager-5b446d88c5-lp6rb" Dec 09 15:10:33 crc kubenswrapper[4555]: I1209 15:10:33.010807 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-gkkj9" Dec 09 15:10:33 crc kubenswrapper[4555]: I1209 15:10:33.024913 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-lp6rb" Dec 09 15:10:33 crc kubenswrapper[4555]: I1209 15:10:33.033138 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-l4ts5" Dec 09 15:10:33 crc kubenswrapper[4555]: I1209 15:10:33.182124 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-gkkj9"] Dec 09 15:10:33 crc kubenswrapper[4555]: I1209 15:10:33.425064 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-lp6rb"] Dec 09 15:10:33 crc kubenswrapper[4555]: I1209 15:10:33.468871 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-l4ts5"] Dec 09 15:10:33 crc kubenswrapper[4555]: I1209 15:10:33.642565 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-lp6rb" event={"ID":"c6c72476-b1dc-4dd0-a0af-1dbda507a483","Type":"ContainerStarted","Data":"8165209a12cc3c4ed63d65b4a3ee8adb5c71179563c110b4cf4a2a2330bedea5"} Dec 09 15:10:33 crc kubenswrapper[4555]: I1209 15:10:33.643632 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-l4ts5" event={"ID":"4c6fc7d3-7d4a-45cb-bcfd-568d45abba41","Type":"ContainerStarted","Data":"dc88ab82a49558b497d8341c79e8a8b94bb2502a9e8274d7046c936eb03fcd99"} Dec 09 15:10:33 crc kubenswrapper[4555]: I1209 15:10:33.644505 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-gkkj9" event={"ID":"ed698ce4-ff03-4c60-9f16-e9267f8450b1","Type":"ContainerStarted","Data":"8037e6e8e95c81057f63e89aa16f431d1481454185f34eb129f39cffd6875216"} Dec 09 15:10:36 crc kubenswrapper[4555]: I1209 15:10:36.661044 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-l4ts5" event={"ID":"4c6fc7d3-7d4a-45cb-bcfd-568d45abba41","Type":"ContainerStarted","Data":"5544ae9e03e010c6dc390ba91fe365b038c50e30e8ba194b6c4ab1d10488ca73"} Dec 09 15:10:36 crc kubenswrapper[4555]: I1209 15:10:36.661418 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-l4ts5" Dec 09 15:10:36 crc kubenswrapper[4555]: I1209 15:10:36.662968 4555 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-gkkj9" event={"ID":"ed698ce4-ff03-4c60-9f16-e9267f8450b1","Type":"ContainerStarted","Data":"56913c660c7a030587f04fcb9f5875d339b4d46f277f566ae14ff7781f8084c9"} Dec 09 15:10:36 crc kubenswrapper[4555]: I1209 15:10:36.664191 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-lp6rb" event={"ID":"c6c72476-b1dc-4dd0-a0af-1dbda507a483","Type":"ContainerStarted","Data":"14e6affb7bf473d0c1781f5eecffb817debfd6c3693a0b686e1e5da931ac7c26"} Dec 09 15:10:36 crc kubenswrapper[4555]: I1209 15:10:36.687461 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-gkkj9" podStartSLOduration=1.8843519720000002 podStartE2EDuration="4.687446187s" podCreationTimestamp="2025-12-09 15:10:32 +0000 UTC" firstStartedPulling="2025-12-09 15:10:33.192744177 +0000 UTC m=+743.329456895" lastFinishedPulling="2025-12-09 15:10:35.995838392 +0000 UTC m=+746.132551110" observedRunningTime="2025-12-09 15:10:36.683191967 +0000 UTC m=+746.819904695" watchObservedRunningTime="2025-12-09 15:10:36.687446187 +0000 UTC m=+746.824158905" Dec 09 15:10:36 crc kubenswrapper[4555]: I1209 15:10:36.687532 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-l4ts5" podStartSLOduration=2.160380899 podStartE2EDuration="4.687528602s" podCreationTimestamp="2025-12-09 15:10:32 +0000 UTC" firstStartedPulling="2025-12-09 15:10:33.476385837 +0000 UTC m=+743.613098555" lastFinishedPulling="2025-12-09 15:10:36.00353354 +0000 UTC m=+746.140246258" observedRunningTime="2025-12-09 15:10:36.674826665 +0000 UTC m=+746.811539383" watchObservedRunningTime="2025-12-09 15:10:36.687528602 +0000 UTC m=+746.824241320" Dec 09 15:10:36 crc kubenswrapper[4555]: I1209 15:10:36.698985 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-lp6rb" podStartSLOduration=2.123085473 podStartE2EDuration="4.698978109s" podCreationTimestamp="2025-12-09 15:10:32 +0000 UTC" firstStartedPulling="2025-12-09 15:10:33.429562437 +0000 UTC m=+743.566275155" lastFinishedPulling="2025-12-09 15:10:36.005455073 +0000 UTC m=+746.142167791" observedRunningTime="2025-12-09 15:10:36.698088091 +0000 UTC m=+746.834800810" watchObservedRunningTime="2025-12-09 15:10:36.698978109 +0000 UTC m=+746.835690827" Dec 09 15:10:37 crc kubenswrapper[4555]: I1209 15:10:37.424297 4555 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 09 15:10:43 crc kubenswrapper[4555]: I1209 15:10:43.035376 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-l4ts5" Dec 09 15:10:55 crc kubenswrapper[4555]: I1209 15:10:55.531294 4555 patch_prober.go:28] interesting pod/machine-config-daemon-x6p6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 15:10:55 crc kubenswrapper[4555]: I1209 15:10:55.531526 4555 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 15:10:55 crc kubenswrapper[4555]: I1209 15:10:55.531558 4555 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" Dec 09 15:10:55 crc kubenswrapper[4555]: I1209 15:10:55.531952 4555 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1349c5c8b355954a96a1c092ad88f43ece6bf94e84e0fb6591265f020c3fe076"} pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 15:10:55 crc kubenswrapper[4555]: I1209 15:10:55.531997 4555 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" containerName="machine-config-daemon" containerID="cri-o://1349c5c8b355954a96a1c092ad88f43ece6bf94e84e0fb6591265f020c3fe076" gracePeriod=600 Dec 09 15:10:55 crc kubenswrapper[4555]: I1209 15:10:55.752530 4555 generic.go:334] "Generic (PLEG): container finished" podID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" containerID="1349c5c8b355954a96a1c092ad88f43ece6bf94e84e0fb6591265f020c3fe076" exitCode=0 Dec 09 15:10:55 crc kubenswrapper[4555]: I1209 15:10:55.752599 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" event={"ID":"b929ab12-7d60-4e9f-a1a7-3e2d8b086d36","Type":"ContainerDied","Data":"1349c5c8b355954a96a1c092ad88f43ece6bf94e84e0fb6591265f020c3fe076"} Dec 09 15:10:55 crc kubenswrapper[4555]: I1209 15:10:55.752744 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" event={"ID":"b929ab12-7d60-4e9f-a1a7-3e2d8b086d36","Type":"ContainerStarted","Data":"fa5fc33cd5d536ecfcb65483f7919b247963e43ee3446fdfb5d114321c1858f9"} Dec 09 15:10:55 crc kubenswrapper[4555]: I1209 15:10:55.752763 4555 scope.go:117] "RemoveContainer" containerID="8dd058b28813c68e71df78106624226378245df5fe593a53aa3d62ba3999ec86" Dec 09 15:11:01 crc kubenswrapper[4555]: I1209 15:11:01.440719 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fjhnxm"] Dec 09 15:11:01 crc kubenswrapper[4555]: I1209 15:11:01.442161 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fjhnxm" Dec 09 15:11:01 crc kubenswrapper[4555]: I1209 15:11:01.444624 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 09 15:11:01 crc kubenswrapper[4555]: I1209 15:11:01.451202 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fjhnxm"] Dec 09 15:11:01 crc kubenswrapper[4555]: I1209 15:11:01.500901 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhls8\" (UniqueName: \"kubernetes.io/projected/98f2d54b-bda4-4632-9bbd-1f138d08c470-kube-api-access-rhls8\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fjhnxm\" (UID: \"98f2d54b-bda4-4632-9bbd-1f138d08c470\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fjhnxm" Dec 09 15:11:01 crc kubenswrapper[4555]: I1209 15:11:01.500995 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/98f2d54b-bda4-4632-9bbd-1f138d08c470-util\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fjhnxm\" (UID: \"98f2d54b-bda4-4632-9bbd-1f138d08c470\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fjhnxm" Dec 09 15:11:01 crc kubenswrapper[4555]: I1209 15:11:01.501017 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/98f2d54b-bda4-4632-9bbd-1f138d08c470-bundle\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fjhnxm\" (UID: \"98f2d54b-bda4-4632-9bbd-1f138d08c470\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fjhnxm" Dec 09 15:11:01 crc kubenswrapper[4555]: I1209 15:11:01.602518 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhls8\" (UniqueName: \"kubernetes.io/projected/98f2d54b-bda4-4632-9bbd-1f138d08c470-kube-api-access-rhls8\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fjhnxm\" (UID: \"98f2d54b-bda4-4632-9bbd-1f138d08c470\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fjhnxm" Dec 09 15:11:01 crc kubenswrapper[4555]: I1209 15:11:01.602607 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/98f2d54b-bda4-4632-9bbd-1f138d08c470-util\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fjhnxm\" (UID: \"98f2d54b-bda4-4632-9bbd-1f138d08c470\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fjhnxm" Dec 09 15:11:01 crc kubenswrapper[4555]: I1209 15:11:01.602626 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/98f2d54b-bda4-4632-9bbd-1f138d08c470-bundle\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fjhnxm\" (UID: \"98f2d54b-bda4-4632-9bbd-1f138d08c470\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fjhnxm" Dec 09 15:11:01 crc kubenswrapper[4555]: I1209 15:11:01.603066 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/98f2d54b-bda4-4632-9bbd-1f138d08c470-bundle\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fjhnxm\" (UID: \"98f2d54b-bda4-4632-9bbd-1f138d08c470\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fjhnxm" Dec 09 15:11:01 crc kubenswrapper[4555]: I1209 15:11:01.603076 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/98f2d54b-bda4-4632-9bbd-1f138d08c470-util\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fjhnxm\" (UID: \"98f2d54b-bda4-4632-9bbd-1f138d08c470\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fjhnxm" Dec 09 15:11:01 crc kubenswrapper[4555]: I1209 15:11:01.617174 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhls8\" (UniqueName: \"kubernetes.io/projected/98f2d54b-bda4-4632-9bbd-1f138d08c470-kube-api-access-rhls8\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fjhnxm\" (UID: \"98f2d54b-bda4-4632-9bbd-1f138d08c470\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fjhnxm" Dec 09 15:11:01 crc kubenswrapper[4555]: I1209 15:11:01.635921 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb89zbbm"] Dec 09 15:11:01 crc kubenswrapper[4555]: I1209 15:11:01.637060 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb89zbbm" Dec 09 15:11:01 crc kubenswrapper[4555]: I1209 15:11:01.646416 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb89zbbm"] Dec 09 15:11:01 crc kubenswrapper[4555]: I1209 15:11:01.703864 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgcgm\" (UniqueName: \"kubernetes.io/projected/b596a9ab-1868-4254-994e-1a99ff17a68a-kube-api-access-cgcgm\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb89zbbm\" (UID: \"b596a9ab-1868-4254-994e-1a99ff17a68a\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb89zbbm" Dec 09 15:11:01 crc kubenswrapper[4555]: I1209 15:11:01.703910 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b596a9ab-1868-4254-994e-1a99ff17a68a-bundle\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb89zbbm\" (UID: \"b596a9ab-1868-4254-994e-1a99ff17a68a\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb89zbbm" Dec 09 15:11:01 crc kubenswrapper[4555]: I1209 15:11:01.704084 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b596a9ab-1868-4254-994e-1a99ff17a68a-util\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb89zbbm\" (UID: \"b596a9ab-1868-4254-994e-1a99ff17a68a\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb89zbbm" Dec 09 15:11:01 crc kubenswrapper[4555]: I1209 15:11:01.754882 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fjhnxm" Dec 09 15:11:01 crc kubenswrapper[4555]: I1209 15:11:01.805411 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgcgm\" (UniqueName: \"kubernetes.io/projected/b596a9ab-1868-4254-994e-1a99ff17a68a-kube-api-access-cgcgm\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb89zbbm\" (UID: \"b596a9ab-1868-4254-994e-1a99ff17a68a\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb89zbbm" Dec 09 15:11:01 crc kubenswrapper[4555]: I1209 15:11:01.805456 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b596a9ab-1868-4254-994e-1a99ff17a68a-bundle\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb89zbbm\" (UID: \"b596a9ab-1868-4254-994e-1a99ff17a68a\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb89zbbm" Dec 09 15:11:01 crc kubenswrapper[4555]: I1209 15:11:01.805510 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b596a9ab-1868-4254-994e-1a99ff17a68a-util\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb89zbbm\" (UID: \"b596a9ab-1868-4254-994e-1a99ff17a68a\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb89zbbm" Dec 09 15:11:01 crc kubenswrapper[4555]: I1209 15:11:01.805897 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b596a9ab-1868-4254-994e-1a99ff17a68a-util\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb89zbbm\" (UID: \"b596a9ab-1868-4254-994e-1a99ff17a68a\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb89zbbm" Dec 09 15:11:01 crc kubenswrapper[4555]: I1209 15:11:01.805991 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b596a9ab-1868-4254-994e-1a99ff17a68a-bundle\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb89zbbm\" (UID: \"b596a9ab-1868-4254-994e-1a99ff17a68a\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb89zbbm" Dec 09 15:11:01 crc kubenswrapper[4555]: I1209 15:11:01.824064 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgcgm\" (UniqueName: \"kubernetes.io/projected/b596a9ab-1868-4254-994e-1a99ff17a68a-kube-api-access-cgcgm\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb89zbbm\" (UID: \"b596a9ab-1868-4254-994e-1a99ff17a68a\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb89zbbm" Dec 09 15:11:01 crc kubenswrapper[4555]: I1209 15:11:01.950355 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb89zbbm" Dec 09 15:11:02 crc kubenswrapper[4555]: I1209 15:11:02.107012 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fjhnxm"] Dec 09 15:11:02 crc kubenswrapper[4555]: I1209 15:11:02.293326 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb89zbbm"] Dec 09 15:11:02 crc kubenswrapper[4555]: W1209 15:11:02.333748 4555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb596a9ab_1868_4254_994e_1a99ff17a68a.slice/crio-7d6278057533bc778b5ce44efff810c35ae48c73afa2615fb0c0540130d4fae0 WatchSource:0}: Error finding container 7d6278057533bc778b5ce44efff810c35ae48c73afa2615fb0c0540130d4fae0: Status 404 returned error can't find the container with id 7d6278057533bc778b5ce44efff810c35ae48c73afa2615fb0c0540130d4fae0 Dec 09 15:11:02 crc kubenswrapper[4555]: I1209 15:11:02.792425 4555 generic.go:334] "Generic (PLEG): container finished" podID="b596a9ab-1868-4254-994e-1a99ff17a68a" containerID="07f992e7a075d3eb73cd1afbf068db103c00a8a6cb8a80cef2e485b880d5e64c" exitCode=0 Dec 09 15:11:02 crc kubenswrapper[4555]: I1209 15:11:02.792662 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb89zbbm" event={"ID":"b596a9ab-1868-4254-994e-1a99ff17a68a","Type":"ContainerDied","Data":"07f992e7a075d3eb73cd1afbf068db103c00a8a6cb8a80cef2e485b880d5e64c"} Dec 09 15:11:02 crc kubenswrapper[4555]: I1209 15:11:02.792706 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb89zbbm" event={"ID":"b596a9ab-1868-4254-994e-1a99ff17a68a","Type":"ContainerStarted","Data":"7d6278057533bc778b5ce44efff810c35ae48c73afa2615fb0c0540130d4fae0"} Dec 09 15:11:02 crc kubenswrapper[4555]: I1209 15:11:02.797755 4555 generic.go:334] "Generic (PLEG): container finished" podID="98f2d54b-bda4-4632-9bbd-1f138d08c470" containerID="a22bcf04dfabc3b5af2deccb55af82a18a8d692fbe723b012e5cd83d650af593" exitCode=0 Dec 09 15:11:02 crc kubenswrapper[4555]: I1209 15:11:02.797810 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fjhnxm" event={"ID":"98f2d54b-bda4-4632-9bbd-1f138d08c470","Type":"ContainerDied","Data":"a22bcf04dfabc3b5af2deccb55af82a18a8d692fbe723b012e5cd83d650af593"} Dec 09 15:11:02 crc kubenswrapper[4555]: I1209 15:11:02.797828 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fjhnxm" event={"ID":"98f2d54b-bda4-4632-9bbd-1f138d08c470","Type":"ContainerStarted","Data":"06d4f218ffd1b540e296f98fe0494087d4cbc1cb9a0524f4abdc0c11a0dacdde"} Dec 09 15:11:04 crc kubenswrapper[4555]: I1209 15:11:04.808692 4555 generic.go:334] "Generic (PLEG): container finished" podID="b596a9ab-1868-4254-994e-1a99ff17a68a" containerID="2a93b52166d89a832752e8ba1223eaef0dbc91dc40e165597d3dff152443b56d" exitCode=0 Dec 09 15:11:04 crc kubenswrapper[4555]: I1209 15:11:04.808900 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb89zbbm" 
event={"ID":"b596a9ab-1868-4254-994e-1a99ff17a68a","Type":"ContainerDied","Data":"2a93b52166d89a832752e8ba1223eaef0dbc91dc40e165597d3dff152443b56d"} Dec 09 15:11:04 crc kubenswrapper[4555]: I1209 15:11:04.810857 4555 generic.go:334] "Generic (PLEG): container finished" podID="98f2d54b-bda4-4632-9bbd-1f138d08c470" containerID="4e8e9d8df5c74fd0c514e7bb62360b52d42ab82d337a8b5371a0cc81d0bf15c7" exitCode=0 Dec 09 15:11:04 crc kubenswrapper[4555]: I1209 15:11:04.810895 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fjhnxm" event={"ID":"98f2d54b-bda4-4632-9bbd-1f138d08c470","Type":"ContainerDied","Data":"4e8e9d8df5c74fd0c514e7bb62360b52d42ab82d337a8b5371a0cc81d0bf15c7"} Dec 09 15:11:05 crc kubenswrapper[4555]: I1209 15:11:05.206604 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-t5x6x"] Dec 09 15:11:05 crc kubenswrapper[4555]: I1209 15:11:05.207651 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t5x6x" Dec 09 15:11:05 crc kubenswrapper[4555]: I1209 15:11:05.217741 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t5x6x"] Dec 09 15:11:05 crc kubenswrapper[4555]: I1209 15:11:05.353007 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-428dl\" (UniqueName: \"kubernetes.io/projected/85af6e97-d54a-4ccd-b7dd-e71fdffd20bd-kube-api-access-428dl\") pod \"redhat-operators-t5x6x\" (UID: \"85af6e97-d54a-4ccd-b7dd-e71fdffd20bd\") " pod="openshift-marketplace/redhat-operators-t5x6x" Dec 09 15:11:05 crc kubenswrapper[4555]: I1209 15:11:05.353059 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85af6e97-d54a-4ccd-b7dd-e71fdffd20bd-utilities\") pod \"redhat-operators-t5x6x\" (UID: \"85af6e97-d54a-4ccd-b7dd-e71fdffd20bd\") " pod="openshift-marketplace/redhat-operators-t5x6x" Dec 09 15:11:05 crc kubenswrapper[4555]: I1209 15:11:05.353150 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85af6e97-d54a-4ccd-b7dd-e71fdffd20bd-catalog-content\") pod \"redhat-operators-t5x6x\" (UID: \"85af6e97-d54a-4ccd-b7dd-e71fdffd20bd\") " pod="openshift-marketplace/redhat-operators-t5x6x" Dec 09 15:11:05 crc kubenswrapper[4555]: I1209 15:11:05.454206 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85af6e97-d54a-4ccd-b7dd-e71fdffd20bd-catalog-content\") pod \"redhat-operators-t5x6x\" (UID: \"85af6e97-d54a-4ccd-b7dd-e71fdffd20bd\") " pod="openshift-marketplace/redhat-operators-t5x6x" Dec 09 15:11:05 crc kubenswrapper[4555]: I1209 15:11:05.454288 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-428dl\" (UniqueName: \"kubernetes.io/projected/85af6e97-d54a-4ccd-b7dd-e71fdffd20bd-kube-api-access-428dl\") pod \"redhat-operators-t5x6x\" (UID: \"85af6e97-d54a-4ccd-b7dd-e71fdffd20bd\") " pod="openshift-marketplace/redhat-operators-t5x6x" Dec 09 15:11:05 crc kubenswrapper[4555]: I1209 15:11:05.454349 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/85af6e97-d54a-4ccd-b7dd-e71fdffd20bd-utilities\") pod \"redhat-operators-t5x6x\" (UID: \"85af6e97-d54a-4ccd-b7dd-e71fdffd20bd\") " pod="openshift-marketplace/redhat-operators-t5x6x" Dec 09 15:11:05 crc kubenswrapper[4555]: I1209 15:11:05.454689 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85af6e97-d54a-4ccd-b7dd-e71fdffd20bd-catalog-content\") pod \"redhat-operators-t5x6x\" (UID: \"85af6e97-d54a-4ccd-b7dd-e71fdffd20bd\") " pod="openshift-marketplace/redhat-operators-t5x6x" Dec 09 15:11:05 crc kubenswrapper[4555]: I1209 15:11:05.454753 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85af6e97-d54a-4ccd-b7dd-e71fdffd20bd-utilities\") pod \"redhat-operators-t5x6x\" (UID: \"85af6e97-d54a-4ccd-b7dd-e71fdffd20bd\") " pod="openshift-marketplace/redhat-operators-t5x6x" Dec 09 15:11:05 crc kubenswrapper[4555]: I1209 15:11:05.470739 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-428dl\" (UniqueName: \"kubernetes.io/projected/85af6e97-d54a-4ccd-b7dd-e71fdffd20bd-kube-api-access-428dl\") pod \"redhat-operators-t5x6x\" (UID: \"85af6e97-d54a-4ccd-b7dd-e71fdffd20bd\") " pod="openshift-marketplace/redhat-operators-t5x6x" Dec 09 15:11:05 crc kubenswrapper[4555]: I1209 15:11:05.520034 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t5x6x" Dec 09 15:11:05 crc kubenswrapper[4555]: I1209 15:11:05.817195 4555 generic.go:334] "Generic (PLEG): container finished" podID="b596a9ab-1868-4254-994e-1a99ff17a68a" containerID="9a8dba1d8bff58269786ccb1ddd93eb68265855ab540d696b7a8c1fc57e9b116" exitCode=0 Dec 09 15:11:05 crc kubenswrapper[4555]: I1209 15:11:05.817246 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb89zbbm" event={"ID":"b596a9ab-1868-4254-994e-1a99ff17a68a","Type":"ContainerDied","Data":"9a8dba1d8bff58269786ccb1ddd93eb68265855ab540d696b7a8c1fc57e9b116"} Dec 09 15:11:05 crc kubenswrapper[4555]: I1209 15:11:05.819646 4555 generic.go:334] "Generic (PLEG): container finished" podID="98f2d54b-bda4-4632-9bbd-1f138d08c470" containerID="1ab9273e72883ee94ffa1475e164e78a835eb94e7f7219af44f33e22ee5e8232" exitCode=0 Dec 09 15:11:05 crc kubenswrapper[4555]: I1209 15:11:05.819677 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fjhnxm" event={"ID":"98f2d54b-bda4-4632-9bbd-1f138d08c470","Type":"ContainerDied","Data":"1ab9273e72883ee94ffa1475e164e78a835eb94e7f7219af44f33e22ee5e8232"} Dec 09 15:11:05 crc kubenswrapper[4555]: I1209 15:11:05.874238 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t5x6x"] Dec 09 15:11:05 crc kubenswrapper[4555]: W1209 15:11:05.876264 4555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85af6e97_d54a_4ccd_b7dd_e71fdffd20bd.slice/crio-ddbfb3dafa02113543c7dd296df72fba644562db5d0dac99fa226d26f6eea76d WatchSource:0}: Error finding container ddbfb3dafa02113543c7dd296df72fba644562db5d0dac99fa226d26f6eea76d: Status 404 returned error can't find the container with id ddbfb3dafa02113543c7dd296df72fba644562db5d0dac99fa226d26f6eea76d Dec 09 15:11:06 crc kubenswrapper[4555]: I1209 15:11:06.825656 4555 
generic.go:334] "Generic (PLEG): container finished" podID="85af6e97-d54a-4ccd-b7dd-e71fdffd20bd" containerID="29cedf67264146d795fb26eb03a9366a3f388c53ffb0d89e2fadd180dc5c82c7" exitCode=0 Dec 09 15:11:06 crc kubenswrapper[4555]: I1209 15:11:06.825697 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t5x6x" event={"ID":"85af6e97-d54a-4ccd-b7dd-e71fdffd20bd","Type":"ContainerDied","Data":"29cedf67264146d795fb26eb03a9366a3f388c53ffb0d89e2fadd180dc5c82c7"} Dec 09 15:11:06 crc kubenswrapper[4555]: I1209 15:11:06.826105 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t5x6x" event={"ID":"85af6e97-d54a-4ccd-b7dd-e71fdffd20bd","Type":"ContainerStarted","Data":"ddbfb3dafa02113543c7dd296df72fba644562db5d0dac99fa226d26f6eea76d"} Dec 09 15:11:07 crc kubenswrapper[4555]: I1209 15:11:07.018143 4555 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb89zbbm" Dec 09 15:11:07 crc kubenswrapper[4555]: I1209 15:11:07.057261 4555 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fjhnxm" Dec 09 15:11:07 crc kubenswrapper[4555]: I1209 15:11:07.172216 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b596a9ab-1868-4254-994e-1a99ff17a68a-util\") pod \"b596a9ab-1868-4254-994e-1a99ff17a68a\" (UID: \"b596a9ab-1868-4254-994e-1a99ff17a68a\") " Dec 09 15:11:07 crc kubenswrapper[4555]: I1209 15:11:07.172277 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/98f2d54b-bda4-4632-9bbd-1f138d08c470-bundle\") pod \"98f2d54b-bda4-4632-9bbd-1f138d08c470\" (UID: \"98f2d54b-bda4-4632-9bbd-1f138d08c470\") " Dec 09 15:11:07 crc kubenswrapper[4555]: I1209 15:11:07.172326 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhls8\" (UniqueName: \"kubernetes.io/projected/98f2d54b-bda4-4632-9bbd-1f138d08c470-kube-api-access-rhls8\") pod \"98f2d54b-bda4-4632-9bbd-1f138d08c470\" (UID: \"98f2d54b-bda4-4632-9bbd-1f138d08c470\") " Dec 09 15:11:07 crc kubenswrapper[4555]: I1209 15:11:07.172344 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgcgm\" (UniqueName: \"kubernetes.io/projected/b596a9ab-1868-4254-994e-1a99ff17a68a-kube-api-access-cgcgm\") pod \"b596a9ab-1868-4254-994e-1a99ff17a68a\" (UID: \"b596a9ab-1868-4254-994e-1a99ff17a68a\") " Dec 09 15:11:07 crc kubenswrapper[4555]: I1209 15:11:07.172393 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b596a9ab-1868-4254-994e-1a99ff17a68a-bundle\") pod \"b596a9ab-1868-4254-994e-1a99ff17a68a\" (UID: \"b596a9ab-1868-4254-994e-1a99ff17a68a\") " Dec 09 15:11:07 crc kubenswrapper[4555]: I1209 15:11:07.172418 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/98f2d54b-bda4-4632-9bbd-1f138d08c470-util\") pod \"98f2d54b-bda4-4632-9bbd-1f138d08c470\" (UID: \"98f2d54b-bda4-4632-9bbd-1f138d08c470\") " Dec 09 15:11:07 crc kubenswrapper[4555]: I1209 15:11:07.173514 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/98f2d54b-bda4-4632-9bbd-1f138d08c470-bundle" (OuterVolumeSpecName: "bundle") pod "98f2d54b-bda4-4632-9bbd-1f138d08c470" (UID: "98f2d54b-bda4-4632-9bbd-1f138d08c470"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:11:07 crc kubenswrapper[4555]: I1209 15:11:07.173592 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b596a9ab-1868-4254-994e-1a99ff17a68a-bundle" (OuterVolumeSpecName: "bundle") pod "b596a9ab-1868-4254-994e-1a99ff17a68a" (UID: "b596a9ab-1868-4254-994e-1a99ff17a68a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:11:07 crc kubenswrapper[4555]: I1209 15:11:07.176452 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98f2d54b-bda4-4632-9bbd-1f138d08c470-kube-api-access-rhls8" (OuterVolumeSpecName: "kube-api-access-rhls8") pod "98f2d54b-bda4-4632-9bbd-1f138d08c470" (UID: "98f2d54b-bda4-4632-9bbd-1f138d08c470"). InnerVolumeSpecName "kube-api-access-rhls8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:11:07 crc kubenswrapper[4555]: I1209 15:11:07.176627 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b596a9ab-1868-4254-994e-1a99ff17a68a-kube-api-access-cgcgm" (OuterVolumeSpecName: "kube-api-access-cgcgm") pod "b596a9ab-1868-4254-994e-1a99ff17a68a" (UID: "b596a9ab-1868-4254-994e-1a99ff17a68a"). InnerVolumeSpecName "kube-api-access-cgcgm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:11:07 crc kubenswrapper[4555]: I1209 15:11:07.182040 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b596a9ab-1868-4254-994e-1a99ff17a68a-util" (OuterVolumeSpecName: "util") pod "b596a9ab-1868-4254-994e-1a99ff17a68a" (UID: "b596a9ab-1868-4254-994e-1a99ff17a68a"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:11:07 crc kubenswrapper[4555]: I1209 15:11:07.183248 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98f2d54b-bda4-4632-9bbd-1f138d08c470-util" (OuterVolumeSpecName: "util") pod "98f2d54b-bda4-4632-9bbd-1f138d08c470" (UID: "98f2d54b-bda4-4632-9bbd-1f138d08c470"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:11:07 crc kubenswrapper[4555]: I1209 15:11:07.273240 4555 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b596a9ab-1868-4254-994e-1a99ff17a68a-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 15:11:07 crc kubenswrapper[4555]: I1209 15:11:07.273266 4555 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/98f2d54b-bda4-4632-9bbd-1f138d08c470-util\") on node \"crc\" DevicePath \"\"" Dec 09 15:11:07 crc kubenswrapper[4555]: I1209 15:11:07.273275 4555 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b596a9ab-1868-4254-994e-1a99ff17a68a-util\") on node \"crc\" DevicePath \"\"" Dec 09 15:11:07 crc kubenswrapper[4555]: I1209 15:11:07.273282 4555 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/98f2d54b-bda4-4632-9bbd-1f138d08c470-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 15:11:07 crc kubenswrapper[4555]: I1209 15:11:07.273292 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rhls8\" (UniqueName: \"kubernetes.io/projected/98f2d54b-bda4-4632-9bbd-1f138d08c470-kube-api-access-rhls8\") on node \"crc\" DevicePath \"\"" Dec 09 15:11:07 crc kubenswrapper[4555]: I1209 15:11:07.273328 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cgcgm\" (UniqueName: \"kubernetes.io/projected/b596a9ab-1868-4254-994e-1a99ff17a68a-kube-api-access-cgcgm\") on node \"crc\" DevicePath \"\"" Dec 09 15:11:07 crc kubenswrapper[4555]: I1209 15:11:07.832464 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb89zbbm" event={"ID":"b596a9ab-1868-4254-994e-1a99ff17a68a","Type":"ContainerDied","Data":"7d6278057533bc778b5ce44efff810c35ae48c73afa2615fb0c0540130d4fae0"} Dec 09 15:11:07 crc kubenswrapper[4555]: I1209 15:11:07.832704 4555 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d6278057533bc778b5ce44efff810c35ae48c73afa2615fb0c0540130d4fae0" Dec 09 15:11:07 crc kubenswrapper[4555]: I1209 15:11:07.832475 4555 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb89zbbm" Dec 09 15:11:07 crc kubenswrapper[4555]: I1209 15:11:07.834794 4555 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fjhnxm" Dec 09 15:11:07 crc kubenswrapper[4555]: I1209 15:11:07.834797 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fjhnxm" event={"ID":"98f2d54b-bda4-4632-9bbd-1f138d08c470","Type":"ContainerDied","Data":"06d4f218ffd1b540e296f98fe0494087d4cbc1cb9a0524f4abdc0c11a0dacdde"} Dec 09 15:11:07 crc kubenswrapper[4555]: I1209 15:11:07.834841 4555 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06d4f218ffd1b540e296f98fe0494087d4cbc1cb9a0524f4abdc0c11a0dacdde" Dec 09 15:11:07 crc kubenswrapper[4555]: I1209 15:11:07.836179 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t5x6x" event={"ID":"85af6e97-d54a-4ccd-b7dd-e71fdffd20bd","Type":"ContainerStarted","Data":"df7599215e3817ffbd0a09df09c212e838b73a855858fcf964383c07d5391491"} Dec 09 15:11:08 crc kubenswrapper[4555]: I1209 15:11:08.843093 4555 generic.go:334] "Generic (PLEG): container finished" podID="85af6e97-d54a-4ccd-b7dd-e71fdffd20bd" containerID="df7599215e3817ffbd0a09df09c212e838b73a855858fcf964383c07d5391491" exitCode=0 Dec 09 15:11:08 crc kubenswrapper[4555]: I1209 15:11:08.843132 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t5x6x" event={"ID":"85af6e97-d54a-4ccd-b7dd-e71fdffd20bd","Type":"ContainerDied","Data":"df7599215e3817ffbd0a09df09c212e838b73a855858fcf964383c07d5391491"} Dec 09 15:11:09 crc kubenswrapper[4555]: I1209 15:11:09.850479 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t5x6x" event={"ID":"85af6e97-d54a-4ccd-b7dd-e71fdffd20bd","Type":"ContainerStarted","Data":"12cb1a69a0a89187d3e3508386cf26fed259a83151ae139752bf60cff102399e"} Dec 09 15:11:09 crc kubenswrapper[4555]: I1209 15:11:09.874017 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-t5x6x" podStartSLOduration=2.3967810419999998 podStartE2EDuration="4.873991707s" podCreationTimestamp="2025-12-09 15:11:05 +0000 UTC" firstStartedPulling="2025-12-09 15:11:06.827382163 +0000 UTC m=+776.964094881" lastFinishedPulling="2025-12-09 15:11:09.304592828 +0000 UTC m=+779.441305546" observedRunningTime="2025-12-09 15:11:09.871129408 +0000 UTC m=+780.007842127" watchObservedRunningTime="2025-12-09 15:11:09.873991707 +0000 UTC m=+780.010704424" Dec 09 15:11:11 crc kubenswrapper[4555]: I1209 15:11:11.763208 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/cluster-logging-operator-ff9846bd-jxn6k"] Dec 09 15:11:11 crc kubenswrapper[4555]: E1209 15:11:11.763725 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b596a9ab-1868-4254-994e-1a99ff17a68a" containerName="util" Dec 09 15:11:11 crc kubenswrapper[4555]: I1209 15:11:11.763738 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="b596a9ab-1868-4254-994e-1a99ff17a68a" containerName="util" Dec 09 15:11:11 crc kubenswrapper[4555]: E1209 15:11:11.763748 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98f2d54b-bda4-4632-9bbd-1f138d08c470" containerName="extract" Dec 09 15:11:11 crc kubenswrapper[4555]: I1209 15:11:11.763753 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="98f2d54b-bda4-4632-9bbd-1f138d08c470" containerName="extract" Dec 09 15:11:11 crc kubenswrapper[4555]: E1209 15:11:11.763762 4555 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b596a9ab-1868-4254-994e-1a99ff17a68a" containerName="pull" Dec 09 15:11:11 crc kubenswrapper[4555]: I1209 15:11:11.763768 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="b596a9ab-1868-4254-994e-1a99ff17a68a" containerName="pull" Dec 09 15:11:11 crc kubenswrapper[4555]: E1209 15:11:11.763775 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98f2d54b-bda4-4632-9bbd-1f138d08c470" containerName="pull" Dec 09 15:11:11 crc kubenswrapper[4555]: I1209 15:11:11.763781 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="98f2d54b-bda4-4632-9bbd-1f138d08c470" containerName="pull" Dec 09 15:11:11 crc kubenswrapper[4555]: E1209 15:11:11.763792 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b596a9ab-1868-4254-994e-1a99ff17a68a" containerName="extract" Dec 09 15:11:11 crc kubenswrapper[4555]: I1209 15:11:11.763796 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="b596a9ab-1868-4254-994e-1a99ff17a68a" containerName="extract" Dec 09 15:11:11 crc kubenswrapper[4555]: E1209 15:11:11.763804 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98f2d54b-bda4-4632-9bbd-1f138d08c470" containerName="util" Dec 09 15:11:11 crc kubenswrapper[4555]: I1209 15:11:11.763819 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="98f2d54b-bda4-4632-9bbd-1f138d08c470" containerName="util" Dec 09 15:11:11 crc kubenswrapper[4555]: I1209 15:11:11.763930 4555 memory_manager.go:354] "RemoveStaleState removing state" podUID="b596a9ab-1868-4254-994e-1a99ff17a68a" containerName="extract" Dec 09 15:11:11 crc kubenswrapper[4555]: I1209 15:11:11.763940 4555 memory_manager.go:354] "RemoveStaleState removing state" podUID="98f2d54b-bda4-4632-9bbd-1f138d08c470" containerName="extract" Dec 09 15:11:11 crc kubenswrapper[4555]: I1209 15:11:11.764294 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/cluster-logging-operator-ff9846bd-jxn6k" Dec 09 15:11:11 crc kubenswrapper[4555]: I1209 15:11:11.765707 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"cluster-logging-operator-dockercfg-d9npk" Dec 09 15:11:11 crc kubenswrapper[4555]: I1209 15:11:11.766251 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"openshift-service-ca.crt" Dec 09 15:11:11 crc kubenswrapper[4555]: I1209 15:11:11.766991 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"kube-root-ca.crt" Dec 09 15:11:11 crc kubenswrapper[4555]: I1209 15:11:11.773979 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/cluster-logging-operator-ff9846bd-jxn6k"] Dec 09 15:11:11 crc kubenswrapper[4555]: I1209 15:11:11.929746 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8wrh\" (UniqueName: \"kubernetes.io/projected/193bdd06-24b0-4f99-9a9e-bdeeac71dd62-kube-api-access-c8wrh\") pod \"cluster-logging-operator-ff9846bd-jxn6k\" (UID: \"193bdd06-24b0-4f99-9a9e-bdeeac71dd62\") " pod="openshift-logging/cluster-logging-operator-ff9846bd-jxn6k" Dec 09 15:11:12 crc kubenswrapper[4555]: I1209 15:11:12.030525 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8wrh\" (UniqueName: \"kubernetes.io/projected/193bdd06-24b0-4f99-9a9e-bdeeac71dd62-kube-api-access-c8wrh\") pod \"cluster-logging-operator-ff9846bd-jxn6k\" (UID: \"193bdd06-24b0-4f99-9a9e-bdeeac71dd62\") " pod="openshift-logging/cluster-logging-operator-ff9846bd-jxn6k" Dec 09 15:11:12 crc kubenswrapper[4555]: I1209 15:11:12.046595 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8wrh\" (UniqueName: \"kubernetes.io/projected/193bdd06-24b0-4f99-9a9e-bdeeac71dd62-kube-api-access-c8wrh\") pod \"cluster-logging-operator-ff9846bd-jxn6k\" (UID: \"193bdd06-24b0-4f99-9a9e-bdeeac71dd62\") " pod="openshift-logging/cluster-logging-operator-ff9846bd-jxn6k" Dec 09 15:11:12 crc kubenswrapper[4555]: I1209 15:11:12.076737 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/cluster-logging-operator-ff9846bd-jxn6k" Dec 09 15:11:12 crc kubenswrapper[4555]: I1209 15:11:12.241786 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/cluster-logging-operator-ff9846bd-jxn6k"] Dec 09 15:11:12 crc kubenswrapper[4555]: I1209 15:11:12.865165 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/cluster-logging-operator-ff9846bd-jxn6k" event={"ID":"193bdd06-24b0-4f99-9a9e-bdeeac71dd62","Type":"ContainerStarted","Data":"7185056d4233bf9428504a83b9b171d972ac0ad38b70850a3580752b15d22221"} Dec 09 15:11:15 crc kubenswrapper[4555]: I1209 15:11:15.520522 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-t5x6x" Dec 09 15:11:15 crc kubenswrapper[4555]: I1209 15:11:15.520730 4555 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-t5x6x" Dec 09 15:11:15 crc kubenswrapper[4555]: I1209 15:11:15.549622 4555 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-t5x6x" Dec 09 15:11:15 crc kubenswrapper[4555]: I1209 15:11:15.916973 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-t5x6x" Dec 09 15:11:16 crc kubenswrapper[4555]: I1209 15:11:16.889097 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/cluster-logging-operator-ff9846bd-jxn6k" event={"ID":"193bdd06-24b0-4f99-9a9e-bdeeac71dd62","Type":"ContainerStarted","Data":"212d033d36c64d5dd769e757a3155fe6e1abd141ffdaa32b217f3603cffba417"} Dec 09 15:11:16 crc kubenswrapper[4555]: I1209 15:11:16.904018 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/cluster-logging-operator-ff9846bd-jxn6k" podStartSLOduration=2.137995799 podStartE2EDuration="5.904005434s" podCreationTimestamp="2025-12-09 15:11:11 +0000 UTC" firstStartedPulling="2025-12-09 15:11:12.253046345 +0000 UTC m=+782.389759063" lastFinishedPulling="2025-12-09 15:11:16.01905598 +0000 UTC m=+786.155768698" observedRunningTime="2025-12-09 15:11:16.901586453 +0000 UTC m=+787.038299171" watchObservedRunningTime="2025-12-09 15:11:16.904005434 +0000 UTC m=+787.040718153" Dec 09 15:11:18 crc kubenswrapper[4555]: I1209 15:11:18.602722 4555 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t5x6x"] Dec 09 15:11:18 crc kubenswrapper[4555]: I1209 15:11:18.603066 4555 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-t5x6x" podUID="85af6e97-d54a-4ccd-b7dd-e71fdffd20bd" containerName="registry-server" containerID="cri-o://12cb1a69a0a89187d3e3508386cf26fed259a83151ae139752bf60cff102399e" gracePeriod=2 Dec 09 15:11:19 crc kubenswrapper[4555]: I1209 15:11:19.430364 4555 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-t5x6x" Dec 09 15:11:19 crc kubenswrapper[4555]: I1209 15:11:19.528123 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85af6e97-d54a-4ccd-b7dd-e71fdffd20bd-utilities\") pod \"85af6e97-d54a-4ccd-b7dd-e71fdffd20bd\" (UID: \"85af6e97-d54a-4ccd-b7dd-e71fdffd20bd\") " Dec 09 15:11:19 crc kubenswrapper[4555]: I1209 15:11:19.528171 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85af6e97-d54a-4ccd-b7dd-e71fdffd20bd-catalog-content\") pod \"85af6e97-d54a-4ccd-b7dd-e71fdffd20bd\" (UID: \"85af6e97-d54a-4ccd-b7dd-e71fdffd20bd\") " Dec 09 15:11:19 crc kubenswrapper[4555]: I1209 15:11:19.528241 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-428dl\" (UniqueName: \"kubernetes.io/projected/85af6e97-d54a-4ccd-b7dd-e71fdffd20bd-kube-api-access-428dl\") pod \"85af6e97-d54a-4ccd-b7dd-e71fdffd20bd\" (UID: \"85af6e97-d54a-4ccd-b7dd-e71fdffd20bd\") " Dec 09 15:11:19 crc kubenswrapper[4555]: I1209 15:11:19.529114 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85af6e97-d54a-4ccd-b7dd-e71fdffd20bd-utilities" (OuterVolumeSpecName: "utilities") pod "85af6e97-d54a-4ccd-b7dd-e71fdffd20bd" (UID: "85af6e97-d54a-4ccd-b7dd-e71fdffd20bd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:11:19 crc kubenswrapper[4555]: I1209 15:11:19.536464 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85af6e97-d54a-4ccd-b7dd-e71fdffd20bd-kube-api-access-428dl" (OuterVolumeSpecName: "kube-api-access-428dl") pod "85af6e97-d54a-4ccd-b7dd-e71fdffd20bd" (UID: "85af6e97-d54a-4ccd-b7dd-e71fdffd20bd"). InnerVolumeSpecName "kube-api-access-428dl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:11:19 crc kubenswrapper[4555]: I1209 15:11:19.613613 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85af6e97-d54a-4ccd-b7dd-e71fdffd20bd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "85af6e97-d54a-4ccd-b7dd-e71fdffd20bd" (UID: "85af6e97-d54a-4ccd-b7dd-e71fdffd20bd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:11:19 crc kubenswrapper[4555]: I1209 15:11:19.629754 4555 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85af6e97-d54a-4ccd-b7dd-e71fdffd20bd-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 15:11:19 crc kubenswrapper[4555]: I1209 15:11:19.629782 4555 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85af6e97-d54a-4ccd-b7dd-e71fdffd20bd-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 15:11:19 crc kubenswrapper[4555]: I1209 15:11:19.629795 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-428dl\" (UniqueName: \"kubernetes.io/projected/85af6e97-d54a-4ccd-b7dd-e71fdffd20bd-kube-api-access-428dl\") on node \"crc\" DevicePath \"\"" Dec 09 15:11:19 crc kubenswrapper[4555]: I1209 15:11:19.904421 4555 generic.go:334] "Generic (PLEG): container finished" podID="85af6e97-d54a-4ccd-b7dd-e71fdffd20bd" containerID="12cb1a69a0a89187d3e3508386cf26fed259a83151ae139752bf60cff102399e" exitCode=0 Dec 09 15:11:19 crc kubenswrapper[4555]: I1209 15:11:19.904463 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t5x6x" event={"ID":"85af6e97-d54a-4ccd-b7dd-e71fdffd20bd","Type":"ContainerDied","Data":"12cb1a69a0a89187d3e3508386cf26fed259a83151ae139752bf60cff102399e"} Dec 09 15:11:19 crc kubenswrapper[4555]: I1209 15:11:19.904470 4555 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t5x6x" Dec 09 15:11:19 crc kubenswrapper[4555]: I1209 15:11:19.904488 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t5x6x" event={"ID":"85af6e97-d54a-4ccd-b7dd-e71fdffd20bd","Type":"ContainerDied","Data":"ddbfb3dafa02113543c7dd296df72fba644562db5d0dac99fa226d26f6eea76d"} Dec 09 15:11:19 crc kubenswrapper[4555]: I1209 15:11:19.904512 4555 scope.go:117] "RemoveContainer" containerID="12cb1a69a0a89187d3e3508386cf26fed259a83151ae139752bf60cff102399e" Dec 09 15:11:19 crc kubenswrapper[4555]: I1209 15:11:19.918712 4555 scope.go:117] "RemoveContainer" containerID="df7599215e3817ffbd0a09df09c212e838b73a855858fcf964383c07d5391491" Dec 09 15:11:19 crc kubenswrapper[4555]: I1209 15:11:19.925646 4555 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t5x6x"] Dec 09 15:11:19 crc kubenswrapper[4555]: I1209 15:11:19.929359 4555 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-t5x6x"] Dec 09 15:11:19 crc kubenswrapper[4555]: I1209 15:11:19.934515 4555 scope.go:117] "RemoveContainer" containerID="29cedf67264146d795fb26eb03a9366a3f388c53ffb0d89e2fadd180dc5c82c7" Dec 09 15:11:19 crc kubenswrapper[4555]: I1209 15:11:19.966184 4555 scope.go:117] "RemoveContainer" containerID="12cb1a69a0a89187d3e3508386cf26fed259a83151ae139752bf60cff102399e" Dec 09 15:11:19 crc kubenswrapper[4555]: E1209 15:11:19.966484 4555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12cb1a69a0a89187d3e3508386cf26fed259a83151ae139752bf60cff102399e\": container with ID starting with 12cb1a69a0a89187d3e3508386cf26fed259a83151ae139752bf60cff102399e not found: ID does not exist" containerID="12cb1a69a0a89187d3e3508386cf26fed259a83151ae139752bf60cff102399e" Dec 09 15:11:19 crc kubenswrapper[4555]: I1209 15:11:19.966513 4555 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12cb1a69a0a89187d3e3508386cf26fed259a83151ae139752bf60cff102399e"} err="failed to get container status \"12cb1a69a0a89187d3e3508386cf26fed259a83151ae139752bf60cff102399e\": rpc error: code = NotFound desc = could not find container \"12cb1a69a0a89187d3e3508386cf26fed259a83151ae139752bf60cff102399e\": container with ID starting with 12cb1a69a0a89187d3e3508386cf26fed259a83151ae139752bf60cff102399e not found: ID does not exist" Dec 09 15:11:19 crc kubenswrapper[4555]: I1209 15:11:19.966531 4555 scope.go:117] "RemoveContainer" containerID="df7599215e3817ffbd0a09df09c212e838b73a855858fcf964383c07d5391491" Dec 09 15:11:19 crc kubenswrapper[4555]: E1209 15:11:19.967610 4555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df7599215e3817ffbd0a09df09c212e838b73a855858fcf964383c07d5391491\": container with ID starting with df7599215e3817ffbd0a09df09c212e838b73a855858fcf964383c07d5391491 not found: ID does not exist" containerID="df7599215e3817ffbd0a09df09c212e838b73a855858fcf964383c07d5391491" Dec 09 15:11:19 crc kubenswrapper[4555]: I1209 15:11:19.967667 4555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df7599215e3817ffbd0a09df09c212e838b73a855858fcf964383c07d5391491"} err="failed to get container status \"df7599215e3817ffbd0a09df09c212e838b73a855858fcf964383c07d5391491\": rpc error: code = NotFound desc = could not find container \"df7599215e3817ffbd0a09df09c212e838b73a855858fcf964383c07d5391491\": container with ID starting with df7599215e3817ffbd0a09df09c212e838b73a855858fcf964383c07d5391491 not found: ID does not exist" Dec 09 15:11:19 crc kubenswrapper[4555]: I1209 15:11:19.967699 4555 scope.go:117] "RemoveContainer" containerID="29cedf67264146d795fb26eb03a9366a3f388c53ffb0d89e2fadd180dc5c82c7" Dec 09 15:11:19 crc kubenswrapper[4555]: E1209 15:11:19.971517 4555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29cedf67264146d795fb26eb03a9366a3f388c53ffb0d89e2fadd180dc5c82c7\": container with ID starting with 29cedf67264146d795fb26eb03a9366a3f388c53ffb0d89e2fadd180dc5c82c7 not found: ID does not exist" containerID="29cedf67264146d795fb26eb03a9366a3f388c53ffb0d89e2fadd180dc5c82c7" Dec 09 15:11:19 crc kubenswrapper[4555]: I1209 15:11:19.971743 4555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29cedf67264146d795fb26eb03a9366a3f388c53ffb0d89e2fadd180dc5c82c7"} err="failed to get container status \"29cedf67264146d795fb26eb03a9366a3f388c53ffb0d89e2fadd180dc5c82c7\": rpc error: code = NotFound desc = could not find container \"29cedf67264146d795fb26eb03a9366a3f388c53ffb0d89e2fadd180dc5c82c7\": container with ID starting with 29cedf67264146d795fb26eb03a9366a3f388c53ffb0d89e2fadd180dc5c82c7 not found: ID does not exist" Dec 09 15:11:20 crc kubenswrapper[4555]: I1209 15:11:20.167053 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85af6e97-d54a-4ccd-b7dd-e71fdffd20bd" path="/var/lib/kubelet/pods/85af6e97-d54a-4ccd-b7dd-e71fdffd20bd/volumes" Dec 09 15:11:23 crc kubenswrapper[4555]: I1209 15:11:23.453476 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-8b59b96d9-kp25m"] Dec 09 15:11:23 crc kubenswrapper[4555]: E1209 15:11:23.454336 4555 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="85af6e97-d54a-4ccd-b7dd-e71fdffd20bd" containerName="registry-server" Dec 09 15:11:23 crc kubenswrapper[4555]: I1209 15:11:23.454415 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="85af6e97-d54a-4ccd-b7dd-e71fdffd20bd" containerName="registry-server" Dec 09 15:11:23 crc kubenswrapper[4555]: E1209 15:11:23.454489 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85af6e97-d54a-4ccd-b7dd-e71fdffd20bd" containerName="extract-content" Dec 09 15:11:23 crc kubenswrapper[4555]: I1209 15:11:23.454541 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="85af6e97-d54a-4ccd-b7dd-e71fdffd20bd" containerName="extract-content" Dec 09 15:11:23 crc kubenswrapper[4555]: E1209 15:11:23.454604 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85af6e97-d54a-4ccd-b7dd-e71fdffd20bd" containerName="extract-utilities" Dec 09 15:11:23 crc kubenswrapper[4555]: I1209 15:11:23.454645 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="85af6e97-d54a-4ccd-b7dd-e71fdffd20bd" containerName="extract-utilities" Dec 09 15:11:23 crc kubenswrapper[4555]: I1209 15:11:23.454823 4555 memory_manager.go:354] "RemoveStaleState removing state" podUID="85af6e97-d54a-4ccd-b7dd-e71fdffd20bd" containerName="registry-server" Dec 09 15:11:23 crc kubenswrapper[4555]: I1209 15:11:23.455475 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-8b59b96d9-kp25m" Dec 09 15:11:23 crc kubenswrapper[4555]: I1209 15:11:23.457647 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"kube-root-ca.crt" Dec 09 15:11:23 crc kubenswrapper[4555]: I1209 15:11:23.457868 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-service-cert" Dec 09 15:11:23 crc kubenswrapper[4555]: I1209 15:11:23.458034 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-metrics" Dec 09 15:11:23 crc kubenswrapper[4555]: I1209 15:11:23.458139 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"loki-operator-manager-config" Dec 09 15:11:23 crc kubenswrapper[4555]: I1209 15:11:23.458244 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-dockercfg-9xjzf" Dec 09 15:11:23 crc kubenswrapper[4555]: I1209 15:11:23.458263 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"openshift-service-ca.crt" Dec 09 15:11:23 crc kubenswrapper[4555]: I1209 15:11:23.467081 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-8b59b96d9-kp25m"] Dec 09 15:11:23 crc kubenswrapper[4555]: I1209 15:11:23.577398 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f6cf7440-1934-4f8f-9e77-22f1d3315527-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-8b59b96d9-kp25m\" (UID: \"f6cf7440-1934-4f8f-9e77-22f1d3315527\") " pod="openshift-operators-redhat/loki-operator-controller-manager-8b59b96d9-kp25m" Dec 09 15:11:23 crc kubenswrapper[4555]: I1209 15:11:23.577446 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" 
(UniqueName: \"kubernetes.io/configmap/f6cf7440-1934-4f8f-9e77-22f1d3315527-manager-config\") pod \"loki-operator-controller-manager-8b59b96d9-kp25m\" (UID: \"f6cf7440-1934-4f8f-9e77-22f1d3315527\") " pod="openshift-operators-redhat/loki-operator-controller-manager-8b59b96d9-kp25m" Dec 09 15:11:23 crc kubenswrapper[4555]: I1209 15:11:23.577503 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f6cf7440-1934-4f8f-9e77-22f1d3315527-webhook-cert\") pod \"loki-operator-controller-manager-8b59b96d9-kp25m\" (UID: \"f6cf7440-1934-4f8f-9e77-22f1d3315527\") " pod="openshift-operators-redhat/loki-operator-controller-manager-8b59b96d9-kp25m" Dec 09 15:11:23 crc kubenswrapper[4555]: I1209 15:11:23.577520 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9b6t\" (UniqueName: \"kubernetes.io/projected/f6cf7440-1934-4f8f-9e77-22f1d3315527-kube-api-access-z9b6t\") pod \"loki-operator-controller-manager-8b59b96d9-kp25m\" (UID: \"f6cf7440-1934-4f8f-9e77-22f1d3315527\") " pod="openshift-operators-redhat/loki-operator-controller-manager-8b59b96d9-kp25m" Dec 09 15:11:23 crc kubenswrapper[4555]: I1209 15:11:23.577557 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f6cf7440-1934-4f8f-9e77-22f1d3315527-apiservice-cert\") pod \"loki-operator-controller-manager-8b59b96d9-kp25m\" (UID: \"f6cf7440-1934-4f8f-9e77-22f1d3315527\") " pod="openshift-operators-redhat/loki-operator-controller-manager-8b59b96d9-kp25m" Dec 09 15:11:23 crc kubenswrapper[4555]: I1209 15:11:23.678554 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f6cf7440-1934-4f8f-9e77-22f1d3315527-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-8b59b96d9-kp25m\" (UID: \"f6cf7440-1934-4f8f-9e77-22f1d3315527\") " pod="openshift-operators-redhat/loki-operator-controller-manager-8b59b96d9-kp25m" Dec 09 15:11:23 crc kubenswrapper[4555]: I1209 15:11:23.678792 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/f6cf7440-1934-4f8f-9e77-22f1d3315527-manager-config\") pod \"loki-operator-controller-manager-8b59b96d9-kp25m\" (UID: \"f6cf7440-1934-4f8f-9e77-22f1d3315527\") " pod="openshift-operators-redhat/loki-operator-controller-manager-8b59b96d9-kp25m" Dec 09 15:11:23 crc kubenswrapper[4555]: I1209 15:11:23.678901 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f6cf7440-1934-4f8f-9e77-22f1d3315527-webhook-cert\") pod \"loki-operator-controller-manager-8b59b96d9-kp25m\" (UID: \"f6cf7440-1934-4f8f-9e77-22f1d3315527\") " pod="openshift-operators-redhat/loki-operator-controller-manager-8b59b96d9-kp25m" Dec 09 15:11:23 crc kubenswrapper[4555]: I1209 15:11:23.678973 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9b6t\" (UniqueName: \"kubernetes.io/projected/f6cf7440-1934-4f8f-9e77-22f1d3315527-kube-api-access-z9b6t\") pod \"loki-operator-controller-manager-8b59b96d9-kp25m\" (UID: \"f6cf7440-1934-4f8f-9e77-22f1d3315527\") " pod="openshift-operators-redhat/loki-operator-controller-manager-8b59b96d9-kp25m" Dec 09 15:11:23 crc kubenswrapper[4555]: I1209 
15:11:23.679051 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f6cf7440-1934-4f8f-9e77-22f1d3315527-apiservice-cert\") pod \"loki-operator-controller-manager-8b59b96d9-kp25m\" (UID: \"f6cf7440-1934-4f8f-9e77-22f1d3315527\") " pod="openshift-operators-redhat/loki-operator-controller-manager-8b59b96d9-kp25m" Dec 09 15:11:23 crc kubenswrapper[4555]: I1209 15:11:23.679538 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/f6cf7440-1934-4f8f-9e77-22f1d3315527-manager-config\") pod \"loki-operator-controller-manager-8b59b96d9-kp25m\" (UID: \"f6cf7440-1934-4f8f-9e77-22f1d3315527\") " pod="openshift-operators-redhat/loki-operator-controller-manager-8b59b96d9-kp25m" Dec 09 15:11:23 crc kubenswrapper[4555]: I1209 15:11:23.683449 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f6cf7440-1934-4f8f-9e77-22f1d3315527-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-8b59b96d9-kp25m\" (UID: \"f6cf7440-1934-4f8f-9e77-22f1d3315527\") " pod="openshift-operators-redhat/loki-operator-controller-manager-8b59b96d9-kp25m" Dec 09 15:11:23 crc kubenswrapper[4555]: I1209 15:11:23.686862 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f6cf7440-1934-4f8f-9e77-22f1d3315527-apiservice-cert\") pod \"loki-operator-controller-manager-8b59b96d9-kp25m\" (UID: \"f6cf7440-1934-4f8f-9e77-22f1d3315527\") " pod="openshift-operators-redhat/loki-operator-controller-manager-8b59b96d9-kp25m" Dec 09 15:11:23 crc kubenswrapper[4555]: I1209 15:11:23.687278 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f6cf7440-1934-4f8f-9e77-22f1d3315527-webhook-cert\") pod \"loki-operator-controller-manager-8b59b96d9-kp25m\" (UID: \"f6cf7440-1934-4f8f-9e77-22f1d3315527\") " pod="openshift-operators-redhat/loki-operator-controller-manager-8b59b96d9-kp25m" Dec 09 15:11:23 crc kubenswrapper[4555]: I1209 15:11:23.691498 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9b6t\" (UniqueName: \"kubernetes.io/projected/f6cf7440-1934-4f8f-9e77-22f1d3315527-kube-api-access-z9b6t\") pod \"loki-operator-controller-manager-8b59b96d9-kp25m\" (UID: \"f6cf7440-1934-4f8f-9e77-22f1d3315527\") " pod="openshift-operators-redhat/loki-operator-controller-manager-8b59b96d9-kp25m" Dec 09 15:11:23 crc kubenswrapper[4555]: I1209 15:11:23.773851 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-8b59b96d9-kp25m" Dec 09 15:11:24 crc kubenswrapper[4555]: I1209 15:11:24.131233 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-8b59b96d9-kp25m"] Dec 09 15:11:24 crc kubenswrapper[4555]: I1209 15:11:24.938811 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-8b59b96d9-kp25m" event={"ID":"f6cf7440-1934-4f8f-9e77-22f1d3315527","Type":"ContainerStarted","Data":"cd3a7f0a4077f38b1912afb6d13be943af6710f87bd7c49f02e91ae53766267c"} Dec 09 15:11:30 crc kubenswrapper[4555]: I1209 15:11:30.970761 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-8b59b96d9-kp25m" event={"ID":"f6cf7440-1934-4f8f-9e77-22f1d3315527","Type":"ContainerStarted","Data":"3c7391ea4607401a4e09eed01395432e625502566aef16d61cdbd4c79a21e7d4"} Dec 09 15:11:35 crc kubenswrapper[4555]: I1209 15:11:35.996022 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-8b59b96d9-kp25m" event={"ID":"f6cf7440-1934-4f8f-9e77-22f1d3315527","Type":"ContainerStarted","Data":"3b47a41a8bb326a3987d94541293a993a0b6c03a6fbea08faef56bf06b46907e"} Dec 09 15:11:35 crc kubenswrapper[4555]: I1209 15:11:35.996401 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators-redhat/loki-operator-controller-manager-8b59b96d9-kp25m" Dec 09 15:11:35 crc kubenswrapper[4555]: I1209 15:11:35.998631 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators-redhat/loki-operator-controller-manager-8b59b96d9-kp25m" Dec 09 15:11:36 crc kubenswrapper[4555]: I1209 15:11:36.013401 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators-redhat/loki-operator-controller-manager-8b59b96d9-kp25m" podStartSLOduration=1.7815434730000002 podStartE2EDuration="13.013386943s" podCreationTimestamp="2025-12-09 15:11:23 +0000 UTC" firstStartedPulling="2025-12-09 15:11:24.136239559 +0000 UTC m=+794.272952276" lastFinishedPulling="2025-12-09 15:11:35.368083027 +0000 UTC m=+805.504795746" observedRunningTime="2025-12-09 15:11:36.008931721 +0000 UTC m=+806.145644439" watchObservedRunningTime="2025-12-09 15:11:36.013386943 +0000 UTC m=+806.150099661" Dec 09 15:11:39 crc kubenswrapper[4555]: I1209 15:11:39.193877 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["minio-dev/minio"] Dec 09 15:11:39 crc kubenswrapper[4555]: I1209 15:11:39.195146 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="minio-dev/minio" Dec 09 15:11:39 crc kubenswrapper[4555]: I1209 15:11:39.196621 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"openshift-service-ca.crt" Dec 09 15:11:39 crc kubenswrapper[4555]: I1209 15:11:39.196639 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"kube-root-ca.crt" Dec 09 15:11:39 crc kubenswrapper[4555]: I1209 15:11:39.198421 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Dec 09 15:11:39 crc kubenswrapper[4555]: I1209 15:11:39.313856 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-6fb77ced-246e-45fa-99ee-138c4b025057\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6fb77ced-246e-45fa-99ee-138c4b025057\") pod \"minio\" (UID: \"4b71f402-69d8-4599-8ebd-85b2f81d476d\") " pod="minio-dev/minio" Dec 09 15:11:39 crc kubenswrapper[4555]: I1209 15:11:39.313918 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8s66k\" (UniqueName: \"kubernetes.io/projected/4b71f402-69d8-4599-8ebd-85b2f81d476d-kube-api-access-8s66k\") pod \"minio\" (UID: \"4b71f402-69d8-4599-8ebd-85b2f81d476d\") " pod="minio-dev/minio" Dec 09 15:11:39 crc kubenswrapper[4555]: I1209 15:11:39.415664 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-6fb77ced-246e-45fa-99ee-138c4b025057\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6fb77ced-246e-45fa-99ee-138c4b025057\") pod \"minio\" (UID: \"4b71f402-69d8-4599-8ebd-85b2f81d476d\") " pod="minio-dev/minio" Dec 09 15:11:39 crc kubenswrapper[4555]: I1209 15:11:39.415720 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8s66k\" (UniqueName: \"kubernetes.io/projected/4b71f402-69d8-4599-8ebd-85b2f81d476d-kube-api-access-8s66k\") pod \"minio\" (UID: \"4b71f402-69d8-4599-8ebd-85b2f81d476d\") " pod="minio-dev/minio" Dec 09 15:11:39 crc kubenswrapper[4555]: I1209 15:11:39.418048 4555 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 09 15:11:39 crc kubenswrapper[4555]: I1209 15:11:39.418087 4555 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-6fb77ced-246e-45fa-99ee-138c4b025057\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6fb77ced-246e-45fa-99ee-138c4b025057\") pod \"minio\" (UID: \"4b71f402-69d8-4599-8ebd-85b2f81d476d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/66b0875da275452b2aeaed58510585077925c7b731937cc9069572399d8b31ec/globalmount\"" pod="minio-dev/minio" Dec 09 15:11:39 crc kubenswrapper[4555]: I1209 15:11:39.433023 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8s66k\" (UniqueName: \"kubernetes.io/projected/4b71f402-69d8-4599-8ebd-85b2f81d476d-kube-api-access-8s66k\") pod \"minio\" (UID: \"4b71f402-69d8-4599-8ebd-85b2f81d476d\") " pod="minio-dev/minio" Dec 09 15:11:39 crc kubenswrapper[4555]: I1209 15:11:39.436502 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-6fb77ced-246e-45fa-99ee-138c4b025057\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6fb77ced-246e-45fa-99ee-138c4b025057\") pod \"minio\" (UID: \"4b71f402-69d8-4599-8ebd-85b2f81d476d\") " pod="minio-dev/minio" Dec 09 15:11:39 crc kubenswrapper[4555]: I1209 15:11:39.509644 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="minio-dev/minio" Dec 09 15:11:39 crc kubenswrapper[4555]: I1209 15:11:39.847611 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Dec 09 15:11:40 crc kubenswrapper[4555]: I1209 15:11:40.014504 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"4b71f402-69d8-4599-8ebd-85b2f81d476d","Type":"ContainerStarted","Data":"d57715a43b98dfa14d761139e9297cf55a95c56f7b2d8a80936c5821288944e8"} Dec 09 15:11:43 crc kubenswrapper[4555]: I1209 15:11:43.032263 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"4b71f402-69d8-4599-8ebd-85b2f81d476d","Type":"ContainerStarted","Data":"5136f1c989c81e9996d9d5967364079aa6d38eb3828e7248ada277871cb6b528"} Dec 09 15:11:43 crc kubenswrapper[4555]: I1209 15:11:43.044000 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="minio-dev/minio" podStartSLOduration=3.854901871 podStartE2EDuration="6.043986806s" podCreationTimestamp="2025-12-09 15:11:37 +0000 UTC" firstStartedPulling="2025-12-09 15:11:39.853112356 +0000 UTC m=+809.989825075" lastFinishedPulling="2025-12-09 15:11:42.042197291 +0000 UTC m=+812.178910010" observedRunningTime="2025-12-09 15:11:43.043679667 +0000 UTC m=+813.180392385" watchObservedRunningTime="2025-12-09 15:11:43.043986806 +0000 UTC m=+813.180699524" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.243746 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-distributor-76cc67bf56-tfvnb"] Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.245462 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-distributor-76cc67bf56-tfvnb" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.247208 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-distributor-http" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.247680 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-config" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.248827 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-dockercfg-xz5kw" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.249065 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-distributor-grpc" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.250059 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-ca-bundle" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.283865 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-distributor-76cc67bf56-tfvnb"] Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.386369 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-querier-5895d59bb8-9ssm2"] Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.387086 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-querier-5895d59bb8-9ssm2" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.388638 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-s3" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.388973 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-querier-grpc" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.392600 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-querier-http" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.394802 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-querier-5895d59bb8-9ssm2"] Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.420834 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2672ffe-a69b-4127-95b0-077fff7c2dae-logging-loki-ca-bundle\") pod \"logging-loki-querier-5895d59bb8-9ssm2\" (UID: \"f2672ffe-a69b-4127-95b0-077fff7c2dae\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-9ssm2" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.420931 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84677826-310d-4103-8976-179fea4591d4-config\") pod \"logging-loki-distributor-76cc67bf56-tfvnb\" (UID: \"84677826-310d-4103-8976-179fea4591d4\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-tfvnb" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.420993 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/84677826-310d-4103-8976-179fea4591d4-logging-loki-distributor-http\") pod \"logging-loki-distributor-76cc67bf56-tfvnb\" (UID: \"84677826-310d-4103-8976-179fea4591d4\") " 
pod="openshift-logging/logging-loki-distributor-76cc67bf56-tfvnb" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.421014 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/f2672ffe-a69b-4127-95b0-077fff7c2dae-logging-loki-querier-grpc\") pod \"logging-loki-querier-5895d59bb8-9ssm2\" (UID: \"f2672ffe-a69b-4127-95b0-077fff7c2dae\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-9ssm2" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.421038 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/f2672ffe-a69b-4127-95b0-077fff7c2dae-logging-loki-querier-http\") pod \"logging-loki-querier-5895d59bb8-9ssm2\" (UID: \"f2672ffe-a69b-4127-95b0-077fff7c2dae\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-9ssm2" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.421055 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdq9g\" (UniqueName: \"kubernetes.io/projected/84677826-310d-4103-8976-179fea4591d4-kube-api-access-wdq9g\") pod \"logging-loki-distributor-76cc67bf56-tfvnb\" (UID: \"84677826-310d-4103-8976-179fea4591d4\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-tfvnb" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.421108 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2672ffe-a69b-4127-95b0-077fff7c2dae-config\") pod \"logging-loki-querier-5895d59bb8-9ssm2\" (UID: \"f2672ffe-a69b-4127-95b0-077fff7c2dae\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-9ssm2" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.421124 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/84677826-310d-4103-8976-179fea4591d4-logging-loki-ca-bundle\") pod \"logging-loki-distributor-76cc67bf56-tfvnb\" (UID: \"84677826-310d-4103-8976-179fea4591d4\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-tfvnb" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.421156 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9jmd\" (UniqueName: \"kubernetes.io/projected/f2672ffe-a69b-4127-95b0-077fff7c2dae-kube-api-access-k9jmd\") pod \"logging-loki-querier-5895d59bb8-9ssm2\" (UID: \"f2672ffe-a69b-4127-95b0-077fff7c2dae\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-9ssm2" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.421186 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/84677826-310d-4103-8976-179fea4591d4-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-76cc67bf56-tfvnb\" (UID: \"84677826-310d-4103-8976-179fea4591d4\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-tfvnb" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.421204 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/f2672ffe-a69b-4127-95b0-077fff7c2dae-logging-loki-s3\") pod \"logging-loki-querier-5895d59bb8-9ssm2\" (UID: 
\"f2672ffe-a69b-4127-95b0-077fff7c2dae\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-9ssm2" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.448030 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-query-frontend-84558f7c9f-9vk9m"] Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.448920 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-9vk9m" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.451595 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-query-frontend-http" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.452096 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-query-frontend-grpc" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.466613 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-query-frontend-84558f7c9f-9vk9m"] Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.522120 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/84677826-310d-4103-8976-179fea4591d4-logging-loki-distributor-http\") pod \"logging-loki-distributor-76cc67bf56-tfvnb\" (UID: \"84677826-310d-4103-8976-179fea4591d4\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-tfvnb" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.522418 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/f2672ffe-a69b-4127-95b0-077fff7c2dae-logging-loki-querier-grpc\") pod \"logging-loki-querier-5895d59bb8-9ssm2\" (UID: \"f2672ffe-a69b-4127-95b0-077fff7c2dae\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-9ssm2" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.522509 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/f2672ffe-a69b-4127-95b0-077fff7c2dae-logging-loki-querier-http\") pod \"logging-loki-querier-5895d59bb8-9ssm2\" (UID: \"f2672ffe-a69b-4127-95b0-077fff7c2dae\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-9ssm2" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.522586 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdq9g\" (UniqueName: \"kubernetes.io/projected/84677826-310d-4103-8976-179fea4591d4-kube-api-access-wdq9g\") pod \"logging-loki-distributor-76cc67bf56-tfvnb\" (UID: \"84677826-310d-4103-8976-179fea4591d4\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-tfvnb" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.522676 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/bbbfd9c8-69ae-4dbb-99b1-1a62073759b3-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-84558f7c9f-9vk9m\" (UID: \"bbbfd9c8-69ae-4dbb-99b1-1a62073759b3\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-9vk9m" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.522735 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/bbbfd9c8-69ae-4dbb-99b1-1a62073759b3-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-84558f7c9f-9vk9m\" (UID: \"bbbfd9c8-69ae-4dbb-99b1-1a62073759b3\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-9vk9m" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.522829 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2672ffe-a69b-4127-95b0-077fff7c2dae-config\") pod \"logging-loki-querier-5895d59bb8-9ssm2\" (UID: \"f2672ffe-a69b-4127-95b0-077fff7c2dae\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-9ssm2" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.522909 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/84677826-310d-4103-8976-179fea4591d4-logging-loki-ca-bundle\") pod \"logging-loki-distributor-76cc67bf56-tfvnb\" (UID: \"84677826-310d-4103-8976-179fea4591d4\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-tfvnb" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.522989 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zf7q\" (UniqueName: \"kubernetes.io/projected/bbbfd9c8-69ae-4dbb-99b1-1a62073759b3-kube-api-access-8zf7q\") pod \"logging-loki-query-frontend-84558f7c9f-9vk9m\" (UID: \"bbbfd9c8-69ae-4dbb-99b1-1a62073759b3\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-9vk9m" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.523054 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9jmd\" (UniqueName: \"kubernetes.io/projected/f2672ffe-a69b-4127-95b0-077fff7c2dae-kube-api-access-k9jmd\") pod \"logging-loki-querier-5895d59bb8-9ssm2\" (UID: \"f2672ffe-a69b-4127-95b0-077fff7c2dae\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-9ssm2" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.523110 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/bbbfd9c8-69ae-4dbb-99b1-1a62073759b3-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-84558f7c9f-9vk9m\" (UID: \"bbbfd9c8-69ae-4dbb-99b1-1a62073759b3\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-9vk9m" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.523180 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/84677826-310d-4103-8976-179fea4591d4-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-76cc67bf56-tfvnb\" (UID: \"84677826-310d-4103-8976-179fea4591d4\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-tfvnb" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.523245 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbbfd9c8-69ae-4dbb-99b1-1a62073759b3-config\") pod \"logging-loki-query-frontend-84558f7c9f-9vk9m\" (UID: \"bbbfd9c8-69ae-4dbb-99b1-1a62073759b3\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-9vk9m" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.523324 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: 
\"kubernetes.io/secret/f2672ffe-a69b-4127-95b0-077fff7c2dae-logging-loki-s3\") pod \"logging-loki-querier-5895d59bb8-9ssm2\" (UID: \"f2672ffe-a69b-4127-95b0-077fff7c2dae\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-9ssm2" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.523420 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2672ffe-a69b-4127-95b0-077fff7c2dae-logging-loki-ca-bundle\") pod \"logging-loki-querier-5895d59bb8-9ssm2\" (UID: \"f2672ffe-a69b-4127-95b0-077fff7c2dae\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-9ssm2" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.523506 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84677826-310d-4103-8976-179fea4591d4-config\") pod \"logging-loki-distributor-76cc67bf56-tfvnb\" (UID: \"84677826-310d-4103-8976-179fea4591d4\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-tfvnb" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.524441 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84677826-310d-4103-8976-179fea4591d4-config\") pod \"logging-loki-distributor-76cc67bf56-tfvnb\" (UID: \"84677826-310d-4103-8976-179fea4591d4\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-tfvnb" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.525207 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2672ffe-a69b-4127-95b0-077fff7c2dae-config\") pod \"logging-loki-querier-5895d59bb8-9ssm2\" (UID: \"f2672ffe-a69b-4127-95b0-077fff7c2dae\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-9ssm2" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.527381 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/f2672ffe-a69b-4127-95b0-077fff7c2dae-logging-loki-querier-http\") pod \"logging-loki-querier-5895d59bb8-9ssm2\" (UID: \"f2672ffe-a69b-4127-95b0-077fff7c2dae\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-9ssm2" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.527656 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/f2672ffe-a69b-4127-95b0-077fff7c2dae-logging-loki-querier-grpc\") pod \"logging-loki-querier-5895d59bb8-9ssm2\" (UID: \"f2672ffe-a69b-4127-95b0-077fff7c2dae\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-9ssm2" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.527914 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/84677826-310d-4103-8976-179fea4591d4-logging-loki-ca-bundle\") pod \"logging-loki-distributor-76cc67bf56-tfvnb\" (UID: \"84677826-310d-4103-8976-179fea4591d4\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-tfvnb" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.528959 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2672ffe-a69b-4127-95b0-077fff7c2dae-logging-loki-ca-bundle\") pod \"logging-loki-querier-5895d59bb8-9ssm2\" (UID: \"f2672ffe-a69b-4127-95b0-077fff7c2dae\") " 
pod="openshift-logging/logging-loki-querier-5895d59bb8-9ssm2" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.530228 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/84677826-310d-4103-8976-179fea4591d4-logging-loki-distributor-http\") pod \"logging-loki-distributor-76cc67bf56-tfvnb\" (UID: \"84677826-310d-4103-8976-179fea4591d4\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-tfvnb" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.530959 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/f2672ffe-a69b-4127-95b0-077fff7c2dae-logging-loki-s3\") pod \"logging-loki-querier-5895d59bb8-9ssm2\" (UID: \"f2672ffe-a69b-4127-95b0-077fff7c2dae\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-9ssm2" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.531866 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-gateway-5c557c5777-mvw9s"] Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.541808 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/84677826-310d-4103-8976-179fea4591d4-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-76cc67bf56-tfvnb\" (UID: \"84677826-310d-4103-8976-179fea4591d4\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-tfvnb" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.542608 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-gateway-5c557c5777-mvw9s" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.546126 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-gateway-ca-bundle" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.546352 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.546570 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-client-http" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.546674 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-http" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.546797 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-gateway" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.552886 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdq9g\" (UniqueName: \"kubernetes.io/projected/84677826-310d-4103-8976-179fea4591d4-kube-api-access-wdq9g\") pod \"logging-loki-distributor-76cc67bf56-tfvnb\" (UID: \"84677826-310d-4103-8976-179fea4591d4\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-tfvnb" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.556113 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9jmd\" (UniqueName: \"kubernetes.io/projected/f2672ffe-a69b-4127-95b0-077fff7c2dae-kube-api-access-k9jmd\") pod \"logging-loki-querier-5895d59bb8-9ssm2\" (UID: \"f2672ffe-a69b-4127-95b0-077fff7c2dae\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-9ssm2" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.556938 4555 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-5c557c5777-mvw9s"] Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.557648 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-distributor-76cc67bf56-tfvnb" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.568289 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-gateway-5c557c5777-cgcbj"] Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.569501 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-gateway-5c557c5777-cgcbj" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.570746 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-dockercfg-vgglv" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.592070 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-5c557c5777-cgcbj"] Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.625388 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/2e9a80cf-bda4-4c6f-be38-028a906afc09-rbac\") pod \"logging-loki-gateway-5c557c5777-cgcbj\" (UID: \"2e9a80cf-bda4-4c6f-be38-028a906afc09\") " pod="openshift-logging/logging-loki-gateway-5c557c5777-cgcbj" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.625501 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/bbbfd9c8-69ae-4dbb-99b1-1a62073759b3-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-84558f7c9f-9vk9m\" (UID: \"bbbfd9c8-69ae-4dbb-99b1-1a62073759b3\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-9vk9m" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.625529 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bbbfd9c8-69ae-4dbb-99b1-1a62073759b3-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-84558f7c9f-9vk9m\" (UID: \"bbbfd9c8-69ae-4dbb-99b1-1a62073759b3\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-9vk9m" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.625556 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/2e9a80cf-bda4-4c6f-be38-028a906afc09-lokistack-gateway\") pod \"logging-loki-gateway-5c557c5777-cgcbj\" (UID: \"2e9a80cf-bda4-4c6f-be38-028a906afc09\") " pod="openshift-logging/logging-loki-gateway-5c557c5777-cgcbj" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.625577 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/9733ac5a-b591-46d7-ab89-bc55453c8330-tls-secret\") pod \"logging-loki-gateway-5c557c5777-mvw9s\" (UID: \"9733ac5a-b591-46d7-ab89-bc55453c8330\") " pod="openshift-logging/logging-loki-gateway-5c557c5777-mvw9s" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.625605 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zf7q\" (UniqueName: \"kubernetes.io/projected/bbbfd9c8-69ae-4dbb-99b1-1a62073759b3-kube-api-access-8zf7q\") pod 
\"logging-loki-query-frontend-84558f7c9f-9vk9m\" (UID: \"bbbfd9c8-69ae-4dbb-99b1-1a62073759b3\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-9vk9m" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.625619 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2e9a80cf-bda4-4c6f-be38-028a906afc09-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-5c557c5777-cgcbj\" (UID: \"2e9a80cf-bda4-4c6f-be38-028a906afc09\") " pod="openshift-logging/logging-loki-gateway-5c557c5777-cgcbj" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.625642 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/bbbfd9c8-69ae-4dbb-99b1-1a62073759b3-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-84558f7c9f-9vk9m\" (UID: \"bbbfd9c8-69ae-4dbb-99b1-1a62073759b3\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-9vk9m" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.625666 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/9733ac5a-b591-46d7-ab89-bc55453c8330-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-5c557c5777-mvw9s\" (UID: \"9733ac5a-b591-46d7-ab89-bc55453c8330\") " pod="openshift-logging/logging-loki-gateway-5c557c5777-mvw9s" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.625685 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/2e9a80cf-bda4-4c6f-be38-028a906afc09-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-5c557c5777-cgcbj\" (UID: \"2e9a80cf-bda4-4c6f-be38-028a906afc09\") " pod="openshift-logging/logging-loki-gateway-5c557c5777-cgcbj" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.625706 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbbfd9c8-69ae-4dbb-99b1-1a62073759b3-config\") pod \"logging-loki-query-frontend-84558f7c9f-9vk9m\" (UID: \"bbbfd9c8-69ae-4dbb-99b1-1a62073759b3\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-9vk9m" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.625741 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/9733ac5a-b591-46d7-ab89-bc55453c8330-rbac\") pod \"logging-loki-gateway-5c557c5777-mvw9s\" (UID: \"9733ac5a-b591-46d7-ab89-bc55453c8330\") " pod="openshift-logging/logging-loki-gateway-5c557c5777-mvw9s" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.625758 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/9733ac5a-b591-46d7-ab89-bc55453c8330-lokistack-gateway\") pod \"logging-loki-gateway-5c557c5777-mvw9s\" (UID: \"9733ac5a-b591-46d7-ab89-bc55453c8330\") " pod="openshift-logging/logging-loki-gateway-5c557c5777-mvw9s" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.625788 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/9733ac5a-b591-46d7-ab89-bc55453c8330-logging-loki-ca-bundle\") pod \"logging-loki-gateway-5c557c5777-mvw9s\" (UID: \"9733ac5a-b591-46d7-ab89-bc55453c8330\") " pod="openshift-logging/logging-loki-gateway-5c557c5777-mvw9s" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.626238 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/9733ac5a-b591-46d7-ab89-bc55453c8330-tenants\") pod \"logging-loki-gateway-5c557c5777-mvw9s\" (UID: \"9733ac5a-b591-46d7-ab89-bc55453c8330\") " pod="openshift-logging/logging-loki-gateway-5c557c5777-mvw9s" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.626311 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/2e9a80cf-bda4-4c6f-be38-028a906afc09-tenants\") pod \"logging-loki-gateway-5c557c5777-cgcbj\" (UID: \"2e9a80cf-bda4-4c6f-be38-028a906afc09\") " pod="openshift-logging/logging-loki-gateway-5c557c5777-cgcbj" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.626330 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/2e9a80cf-bda4-4c6f-be38-028a906afc09-tls-secret\") pod \"logging-loki-gateway-5c557c5777-cgcbj\" (UID: \"2e9a80cf-bda4-4c6f-be38-028a906afc09\") " pod="openshift-logging/logging-loki-gateway-5c557c5777-cgcbj" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.626348 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqhsb\" (UniqueName: \"kubernetes.io/projected/9733ac5a-b591-46d7-ab89-bc55453c8330-kube-api-access-jqhsb\") pod \"logging-loki-gateway-5c557c5777-mvw9s\" (UID: \"9733ac5a-b591-46d7-ab89-bc55453c8330\") " pod="openshift-logging/logging-loki-gateway-5c557c5777-mvw9s" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.626367 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2e9a80cf-bda4-4c6f-be38-028a906afc09-logging-loki-ca-bundle\") pod \"logging-loki-gateway-5c557c5777-cgcbj\" (UID: \"2e9a80cf-bda4-4c6f-be38-028a906afc09\") " pod="openshift-logging/logging-loki-gateway-5c557c5777-cgcbj" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.626373 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bbbfd9c8-69ae-4dbb-99b1-1a62073759b3-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-84558f7c9f-9vk9m\" (UID: \"bbbfd9c8-69ae-4dbb-99b1-1a62073759b3\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-9vk9m" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.626383 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcmx5\" (UniqueName: \"kubernetes.io/projected/2e9a80cf-bda4-4c6f-be38-028a906afc09-kube-api-access-rcmx5\") pod \"logging-loki-gateway-5c557c5777-cgcbj\" (UID: \"2e9a80cf-bda4-4c6f-be38-028a906afc09\") " pod="openshift-logging/logging-loki-gateway-5c557c5777-cgcbj" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.626448 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/9733ac5a-b591-46d7-ab89-bc55453c8330-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-5c557c5777-mvw9s\" (UID: \"9733ac5a-b591-46d7-ab89-bc55453c8330\") " pod="openshift-logging/logging-loki-gateway-5c557c5777-mvw9s" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.626975 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbbfd9c8-69ae-4dbb-99b1-1a62073759b3-config\") pod \"logging-loki-query-frontend-84558f7c9f-9vk9m\" (UID: \"bbbfd9c8-69ae-4dbb-99b1-1a62073759b3\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-9vk9m" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.635807 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/bbbfd9c8-69ae-4dbb-99b1-1a62073759b3-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-84558f7c9f-9vk9m\" (UID: \"bbbfd9c8-69ae-4dbb-99b1-1a62073759b3\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-9vk9m" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.635910 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/bbbfd9c8-69ae-4dbb-99b1-1a62073759b3-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-84558f7c9f-9vk9m\" (UID: \"bbbfd9c8-69ae-4dbb-99b1-1a62073759b3\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-9vk9m" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.645447 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zf7q\" (UniqueName: \"kubernetes.io/projected/bbbfd9c8-69ae-4dbb-99b1-1a62073759b3-kube-api-access-8zf7q\") pod \"logging-loki-query-frontend-84558f7c9f-9vk9m\" (UID: \"bbbfd9c8-69ae-4dbb-99b1-1a62073759b3\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-9vk9m" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.701567 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-querier-5895d59bb8-9ssm2" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.728864 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/2e9a80cf-bda4-4c6f-be38-028a906afc09-tenants\") pod \"logging-loki-gateway-5c557c5777-cgcbj\" (UID: \"2e9a80cf-bda4-4c6f-be38-028a906afc09\") " pod="openshift-logging/logging-loki-gateway-5c557c5777-cgcbj" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.728891 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/2e9a80cf-bda4-4c6f-be38-028a906afc09-tls-secret\") pod \"logging-loki-gateway-5c557c5777-cgcbj\" (UID: \"2e9a80cf-bda4-4c6f-be38-028a906afc09\") " pod="openshift-logging/logging-loki-gateway-5c557c5777-cgcbj" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.728912 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqhsb\" (UniqueName: \"kubernetes.io/projected/9733ac5a-b591-46d7-ab89-bc55453c8330-kube-api-access-jqhsb\") pod \"logging-loki-gateway-5c557c5777-mvw9s\" (UID: \"9733ac5a-b591-46d7-ab89-bc55453c8330\") " pod="openshift-logging/logging-loki-gateway-5c557c5777-mvw9s" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.728930 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2e9a80cf-bda4-4c6f-be38-028a906afc09-logging-loki-ca-bundle\") pod \"logging-loki-gateway-5c557c5777-cgcbj\" (UID: \"2e9a80cf-bda4-4c6f-be38-028a906afc09\") " pod="openshift-logging/logging-loki-gateway-5c557c5777-cgcbj" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.728946 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcmx5\" (UniqueName: \"kubernetes.io/projected/2e9a80cf-bda4-4c6f-be38-028a906afc09-kube-api-access-rcmx5\") pod \"logging-loki-gateway-5c557c5777-cgcbj\" (UID: \"2e9a80cf-bda4-4c6f-be38-028a906afc09\") " pod="openshift-logging/logging-loki-gateway-5c557c5777-cgcbj" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.728989 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9733ac5a-b591-46d7-ab89-bc55453c8330-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-5c557c5777-mvw9s\" (UID: \"9733ac5a-b591-46d7-ab89-bc55453c8330\") " pod="openshift-logging/logging-loki-gateway-5c557c5777-mvw9s" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.729008 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/2e9a80cf-bda4-4c6f-be38-028a906afc09-rbac\") pod \"logging-loki-gateway-5c557c5777-cgcbj\" (UID: \"2e9a80cf-bda4-4c6f-be38-028a906afc09\") " pod="openshift-logging/logging-loki-gateway-5c557c5777-cgcbj" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.729047 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/2e9a80cf-bda4-4c6f-be38-028a906afc09-lokistack-gateway\") pod \"logging-loki-gateway-5c557c5777-cgcbj\" (UID: \"2e9a80cf-bda4-4c6f-be38-028a906afc09\") " pod="openshift-logging/logging-loki-gateway-5c557c5777-cgcbj" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.729063 4555 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/9733ac5a-b591-46d7-ab89-bc55453c8330-tls-secret\") pod \"logging-loki-gateway-5c557c5777-mvw9s\" (UID: \"9733ac5a-b591-46d7-ab89-bc55453c8330\") " pod="openshift-logging/logging-loki-gateway-5c557c5777-mvw9s" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.729087 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2e9a80cf-bda4-4c6f-be38-028a906afc09-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-5c557c5777-cgcbj\" (UID: \"2e9a80cf-bda4-4c6f-be38-028a906afc09\") " pod="openshift-logging/logging-loki-gateway-5c557c5777-cgcbj" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.729112 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/9733ac5a-b591-46d7-ab89-bc55453c8330-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-5c557c5777-mvw9s\" (UID: \"9733ac5a-b591-46d7-ab89-bc55453c8330\") " pod="openshift-logging/logging-loki-gateway-5c557c5777-mvw9s" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.729127 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/2e9a80cf-bda4-4c6f-be38-028a906afc09-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-5c557c5777-cgcbj\" (UID: \"2e9a80cf-bda4-4c6f-be38-028a906afc09\") " pod="openshift-logging/logging-loki-gateway-5c557c5777-cgcbj" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.729163 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/9733ac5a-b591-46d7-ab89-bc55453c8330-rbac\") pod \"logging-loki-gateway-5c557c5777-mvw9s\" (UID: \"9733ac5a-b591-46d7-ab89-bc55453c8330\") " pod="openshift-logging/logging-loki-gateway-5c557c5777-mvw9s" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.729179 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/9733ac5a-b591-46d7-ab89-bc55453c8330-lokistack-gateway\") pod \"logging-loki-gateway-5c557c5777-mvw9s\" (UID: \"9733ac5a-b591-46d7-ab89-bc55453c8330\") " pod="openshift-logging/logging-loki-gateway-5c557c5777-mvw9s" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.729199 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9733ac5a-b591-46d7-ab89-bc55453c8330-logging-loki-ca-bundle\") pod \"logging-loki-gateway-5c557c5777-mvw9s\" (UID: \"9733ac5a-b591-46d7-ab89-bc55453c8330\") " pod="openshift-logging/logging-loki-gateway-5c557c5777-mvw9s" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.729236 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/9733ac5a-b591-46d7-ab89-bc55453c8330-tenants\") pod \"logging-loki-gateway-5c557c5777-mvw9s\" (UID: \"9733ac5a-b591-46d7-ab89-bc55453c8330\") " pod="openshift-logging/logging-loki-gateway-5c557c5777-mvw9s" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.730360 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/2e9a80cf-bda4-4c6f-be38-028a906afc09-lokistack-gateway\") 
pod \"logging-loki-gateway-5c557c5777-cgcbj\" (UID: \"2e9a80cf-bda4-4c6f-be38-028a906afc09\") " pod="openshift-logging/logging-loki-gateway-5c557c5777-cgcbj" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.731137 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9733ac5a-b591-46d7-ab89-bc55453c8330-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-5c557c5777-mvw9s\" (UID: \"9733ac5a-b591-46d7-ab89-bc55453c8330\") " pod="openshift-logging/logging-loki-gateway-5c557c5777-mvw9s" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.735905 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2e9a80cf-bda4-4c6f-be38-028a906afc09-logging-loki-ca-bundle\") pod \"logging-loki-gateway-5c557c5777-cgcbj\" (UID: \"2e9a80cf-bda4-4c6f-be38-028a906afc09\") " pod="openshift-logging/logging-loki-gateway-5c557c5777-cgcbj" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.736309 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/2e9a80cf-bda4-4c6f-be38-028a906afc09-rbac\") pod \"logging-loki-gateway-5c557c5777-cgcbj\" (UID: \"2e9a80cf-bda4-4c6f-be38-028a906afc09\") " pod="openshift-logging/logging-loki-gateway-5c557c5777-cgcbj" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.736875 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/9733ac5a-b591-46d7-ab89-bc55453c8330-tenants\") pod \"logging-loki-gateway-5c557c5777-mvw9s\" (UID: \"9733ac5a-b591-46d7-ab89-bc55453c8330\") " pod="openshift-logging/logging-loki-gateway-5c557c5777-mvw9s" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.736936 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/9733ac5a-b591-46d7-ab89-bc55453c8330-rbac\") pod \"logging-loki-gateway-5c557c5777-mvw9s\" (UID: \"9733ac5a-b591-46d7-ab89-bc55453c8330\") " pod="openshift-logging/logging-loki-gateway-5c557c5777-mvw9s" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.737482 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2e9a80cf-bda4-4c6f-be38-028a906afc09-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-5c557c5777-cgcbj\" (UID: \"2e9a80cf-bda4-4c6f-be38-028a906afc09\") " pod="openshift-logging/logging-loki-gateway-5c557c5777-cgcbj" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.737534 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/9733ac5a-b591-46d7-ab89-bc55453c8330-lokistack-gateway\") pod \"logging-loki-gateway-5c557c5777-mvw9s\" (UID: \"9733ac5a-b591-46d7-ab89-bc55453c8330\") " pod="openshift-logging/logging-loki-gateway-5c557c5777-mvw9s" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.738003 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9733ac5a-b591-46d7-ab89-bc55453c8330-logging-loki-ca-bundle\") pod \"logging-loki-gateway-5c557c5777-mvw9s\" (UID: \"9733ac5a-b591-46d7-ab89-bc55453c8330\") " pod="openshift-logging/logging-loki-gateway-5c557c5777-mvw9s" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.741186 4555 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/9733ac5a-b591-46d7-ab89-bc55453c8330-tls-secret\") pod \"logging-loki-gateway-5c557c5777-mvw9s\" (UID: \"9733ac5a-b591-46d7-ab89-bc55453c8330\") " pod="openshift-logging/logging-loki-gateway-5c557c5777-mvw9s" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.743082 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/2e9a80cf-bda4-4c6f-be38-028a906afc09-tls-secret\") pod \"logging-loki-gateway-5c557c5777-cgcbj\" (UID: \"2e9a80cf-bda4-4c6f-be38-028a906afc09\") " pod="openshift-logging/logging-loki-gateway-5c557c5777-cgcbj" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.743445 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/2e9a80cf-bda4-4c6f-be38-028a906afc09-tenants\") pod \"logging-loki-gateway-5c557c5777-cgcbj\" (UID: \"2e9a80cf-bda4-4c6f-be38-028a906afc09\") " pod="openshift-logging/logging-loki-gateway-5c557c5777-cgcbj" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.748336 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/9733ac5a-b591-46d7-ab89-bc55453c8330-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-5c557c5777-mvw9s\" (UID: \"9733ac5a-b591-46d7-ab89-bc55453c8330\") " pod="openshift-logging/logging-loki-gateway-5c557c5777-mvw9s" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.758822 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcmx5\" (UniqueName: \"kubernetes.io/projected/2e9a80cf-bda4-4c6f-be38-028a906afc09-kube-api-access-rcmx5\") pod \"logging-loki-gateway-5c557c5777-cgcbj\" (UID: \"2e9a80cf-bda4-4c6f-be38-028a906afc09\") " pod="openshift-logging/logging-loki-gateway-5c557c5777-cgcbj" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.763877 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/2e9a80cf-bda4-4c6f-be38-028a906afc09-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-5c557c5777-cgcbj\" (UID: \"2e9a80cf-bda4-4c6f-be38-028a906afc09\") " pod="openshift-logging/logging-loki-gateway-5c557c5777-cgcbj" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.764204 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-9vk9m" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.773513 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqhsb\" (UniqueName: \"kubernetes.io/projected/9733ac5a-b591-46d7-ab89-bc55453c8330-kube-api-access-jqhsb\") pod \"logging-loki-gateway-5c557c5777-mvw9s\" (UID: \"9733ac5a-b591-46d7-ab89-bc55453c8330\") " pod="openshift-logging/logging-loki-gateway-5c557c5777-mvw9s" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.921496 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-gateway-5c557c5777-mvw9s" Dec 09 15:11:46 crc kubenswrapper[4555]: I1209 15:11:46.928646 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-gateway-5c557c5777-cgcbj" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.065429 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-distributor-76cc67bf56-tfvnb"] Dec 09 15:11:47 crc kubenswrapper[4555]: W1209 15:11:47.072168 4555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84677826_310d_4103_8976_179fea4591d4.slice/crio-fde802f31f8372f828f5a31ee2a1f0aec733019412a70bcea1c2c11f51d6ef38 WatchSource:0}: Error finding container fde802f31f8372f828f5a31ee2a1f0aec733019412a70bcea1c2c11f51d6ef38: Status 404 returned error can't find the container with id fde802f31f8372f828f5a31ee2a1f0aec733019412a70bcea1c2c11f51d6ef38 Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.168144 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-query-frontend-84558f7c9f-9vk9m"] Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.272199 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-querier-5895d59bb8-9ssm2"] Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.331091 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-5c557c5777-mvw9s"] Dec 09 15:11:47 crc kubenswrapper[4555]: W1209 15:11:47.332014 4555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9733ac5a_b591_46d7_ab89_bc55453c8330.slice/crio-ae001552e10e59dbdcbf2208e8ec988cc338cc617c4fddd08d6bd8a9a5104494 WatchSource:0}: Error finding container ae001552e10e59dbdcbf2208e8ec988cc338cc617c4fddd08d6bd8a9a5104494: Status 404 returned error can't find the container with id ae001552e10e59dbdcbf2208e8ec988cc338cc617c4fddd08d6bd8a9a5104494 Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.386980 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-ingester-0"] Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.387998 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-ingester-0" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.390480 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-ingester-http" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.390531 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-ingester-grpc" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.390670 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-5c557c5777-cgcbj"] Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.397252 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-ingester-0"] Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.426556 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-compactor-0"] Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.427442 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-compactor-0" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.429197 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-compactor-http" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.429430 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-compactor-grpc" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.435618 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-compactor-0"] Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.487748 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"] Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.488537 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-index-gateway-0" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.489727 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-index-gateway-grpc" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.489835 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-index-gateway-http" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.499104 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"] Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.541335 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-cc7ebe19-96f0-42cb-ab1b-3e46f6f397c4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cc7ebe19-96f0-42cb-ab1b-3e46f6f397c4\") pod \"logging-loki-ingester-0\" (UID: \"9f161ea6-77ce-41a8-8a32-92006cadcaa4\") " pod="openshift-logging/logging-loki-ingester-0" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.541368 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/37512310-d9d3-46a8-aba6-7699f4ed2bec-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"37512310-d9d3-46a8-aba6-7699f4ed2bec\") " pod="openshift-logging/logging-loki-compactor-0" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.541396 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c75wx\" (UniqueName: \"kubernetes.io/projected/9f161ea6-77ce-41a8-8a32-92006cadcaa4-kube-api-access-c75wx\") pod \"logging-loki-ingester-0\" (UID: \"9f161ea6-77ce-41a8-8a32-92006cadcaa4\") " pod="openshift-logging/logging-loki-ingester-0" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.541418 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/37512310-d9d3-46a8-aba6-7699f4ed2bec-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"37512310-d9d3-46a8-aba6-7699f4ed2bec\") " pod="openshift-logging/logging-loki-compactor-0" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.541433 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/9f161ea6-77ce-41a8-8a32-92006cadcaa4-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: 
\"9f161ea6-77ce-41a8-8a32-92006cadcaa4\") " pod="openshift-logging/logging-loki-ingester-0" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.541453 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-6a4efe1a-5783-4378-8550-0620cb8949f3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6a4efe1a-5783-4378-8550-0620cb8949f3\") pod \"logging-loki-ingester-0\" (UID: \"9f161ea6-77ce-41a8-8a32-92006cadcaa4\") " pod="openshift-logging/logging-loki-ingester-0" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.541565 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f161ea6-77ce-41a8-8a32-92006cadcaa4-config\") pod \"logging-loki-ingester-0\" (UID: \"9f161ea6-77ce-41a8-8a32-92006cadcaa4\") " pod="openshift-logging/logging-loki-ingester-0" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.542439 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f161ea6-77ce-41a8-8a32-92006cadcaa4-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"9f161ea6-77ce-41a8-8a32-92006cadcaa4\") " pod="openshift-logging/logging-loki-ingester-0" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.542467 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/37512310-d9d3-46a8-aba6-7699f4ed2bec-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"37512310-d9d3-46a8-aba6-7699f4ed2bec\") " pod="openshift-logging/logging-loki-compactor-0" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.542517 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/9f161ea6-77ce-41a8-8a32-92006cadcaa4-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"9f161ea6-77ce-41a8-8a32-92006cadcaa4\") " pod="openshift-logging/logging-loki-ingester-0" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.542595 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-3a95b59a-a83c-4a4d-9441-72823bfe7d8f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3a95b59a-a83c-4a4d-9441-72823bfe7d8f\") pod \"logging-loki-compactor-0\" (UID: \"37512310-d9d3-46a8-aba6-7699f4ed2bec\") " pod="openshift-logging/logging-loki-compactor-0" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.543244 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/9f161ea6-77ce-41a8-8a32-92006cadcaa4-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"9f161ea6-77ce-41a8-8a32-92006cadcaa4\") " pod="openshift-logging/logging-loki-ingester-0" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.543362 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmhkh\" (UniqueName: \"kubernetes.io/projected/37512310-d9d3-46a8-aba6-7699f4ed2bec-kube-api-access-gmhkh\") pod \"logging-loki-compactor-0\" (UID: \"37512310-d9d3-46a8-aba6-7699f4ed2bec\") " pod="openshift-logging/logging-loki-compactor-0" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.543449 4555 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/37512310-d9d3-46a8-aba6-7699f4ed2bec-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"37512310-d9d3-46a8-aba6-7699f4ed2bec\") " pod="openshift-logging/logging-loki-compactor-0" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.543939 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37512310-d9d3-46a8-aba6-7699f4ed2bec-config\") pod \"logging-loki-compactor-0\" (UID: \"37512310-d9d3-46a8-aba6-7699f4ed2bec\") " pod="openshift-logging/logging-loki-compactor-0" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.645773 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/9f161ea6-77ce-41a8-8a32-92006cadcaa4-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"9f161ea6-77ce-41a8-8a32-92006cadcaa4\") " pod="openshift-logging/logging-loki-ingester-0" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.645816 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f161ea6-77ce-41a8-8a32-92006cadcaa4-config\") pod \"logging-loki-ingester-0\" (UID: \"9f161ea6-77ce-41a8-8a32-92006cadcaa4\") " pod="openshift-logging/logging-loki-ingester-0" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.645839 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a95fb43d-d122-4136-8502-780c693b27c5-config\") pod \"logging-loki-index-gateway-0\" (UID: \"a95fb43d-d122-4136-8502-780c693b27c5\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.645867 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f161ea6-77ce-41a8-8a32-92006cadcaa4-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"9f161ea6-77ce-41a8-8a32-92006cadcaa4\") " pod="openshift-logging/logging-loki-ingester-0" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.645892 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-3a95b59a-a83c-4a4d-9441-72823bfe7d8f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3a95b59a-a83c-4a4d-9441-72823bfe7d8f\") pod \"logging-loki-compactor-0\" (UID: \"37512310-d9d3-46a8-aba6-7699f4ed2bec\") " pod="openshift-logging/logging-loki-compactor-0" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.645916 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-88954142-0bd5-4bc0-8dc6-b61c877aa246\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-88954142-0bd5-4bc0-8dc6-b61c877aa246\") pod \"logging-loki-index-gateway-0\" (UID: \"a95fb43d-d122-4136-8502-780c693b27c5\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.645936 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/37512310-d9d3-46a8-aba6-7699f4ed2bec-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: 
\"37512310-d9d3-46a8-aba6-7699f4ed2bec\") " pod="openshift-logging/logging-loki-compactor-0" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.645950 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/a95fb43d-d122-4136-8502-780c693b27c5-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"a95fb43d-d122-4136-8502-780c693b27c5\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.645966 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37512310-d9d3-46a8-aba6-7699f4ed2bec-config\") pod \"logging-loki-compactor-0\" (UID: \"37512310-d9d3-46a8-aba6-7699f4ed2bec\") " pod="openshift-logging/logging-loki-compactor-0" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.645991 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-cc7ebe19-96f0-42cb-ab1b-3e46f6f397c4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cc7ebe19-96f0-42cb-ab1b-3e46f6f397c4\") pod \"logging-loki-ingester-0\" (UID: \"9f161ea6-77ce-41a8-8a32-92006cadcaa4\") " pod="openshift-logging/logging-loki-ingester-0" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.646007 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/37512310-d9d3-46a8-aba6-7699f4ed2bec-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"37512310-d9d3-46a8-aba6-7699f4ed2bec\") " pod="openshift-logging/logging-loki-compactor-0" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.646025 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c75wx\" (UniqueName: \"kubernetes.io/projected/9f161ea6-77ce-41a8-8a32-92006cadcaa4-kube-api-access-c75wx\") pod \"logging-loki-ingester-0\" (UID: \"9f161ea6-77ce-41a8-8a32-92006cadcaa4\") " pod="openshift-logging/logging-loki-ingester-0" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.646040 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/37512310-d9d3-46a8-aba6-7699f4ed2bec-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"37512310-d9d3-46a8-aba6-7699f4ed2bec\") " pod="openshift-logging/logging-loki-compactor-0" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.646058 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-6a4efe1a-5783-4378-8550-0620cb8949f3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6a4efe1a-5783-4378-8550-0620cb8949f3\") pod \"logging-loki-ingester-0\" (UID: \"9f161ea6-77ce-41a8-8a32-92006cadcaa4\") " pod="openshift-logging/logging-loki-ingester-0" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.646081 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/37512310-d9d3-46a8-aba6-7699f4ed2bec-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"37512310-d9d3-46a8-aba6-7699f4ed2bec\") " pod="openshift-logging/logging-loki-compactor-0" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.646097 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-xnj6w\" (UniqueName: \"kubernetes.io/projected/a95fb43d-d122-4136-8502-780c693b27c5-kube-api-access-xnj6w\") pod \"logging-loki-index-gateway-0\" (UID: \"a95fb43d-d122-4136-8502-780c693b27c5\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.646112 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/9f161ea6-77ce-41a8-8a32-92006cadcaa4-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"9f161ea6-77ce-41a8-8a32-92006cadcaa4\") " pod="openshift-logging/logging-loki-ingester-0" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.646125 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a95fb43d-d122-4136-8502-780c693b27c5-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"a95fb43d-d122-4136-8502-780c693b27c5\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.646142 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/a95fb43d-d122-4136-8502-780c693b27c5-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"a95fb43d-d122-4136-8502-780c693b27c5\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.646157 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/9f161ea6-77ce-41a8-8a32-92006cadcaa4-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"9f161ea6-77ce-41a8-8a32-92006cadcaa4\") " pod="openshift-logging/logging-loki-ingester-0" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.646173 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmhkh\" (UniqueName: \"kubernetes.io/projected/37512310-d9d3-46a8-aba6-7699f4ed2bec-kube-api-access-gmhkh\") pod \"logging-loki-compactor-0\" (UID: \"37512310-d9d3-46a8-aba6-7699f4ed2bec\") " pod="openshift-logging/logging-loki-compactor-0" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.646201 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/a95fb43d-d122-4136-8502-780c693b27c5-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"a95fb43d-d122-4136-8502-780c693b27c5\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.647376 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f161ea6-77ce-41a8-8a32-92006cadcaa4-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"9f161ea6-77ce-41a8-8a32-92006cadcaa4\") " pod="openshift-logging/logging-loki-ingester-0" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.647914 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f161ea6-77ce-41a8-8a32-92006cadcaa4-config\") pod \"logging-loki-ingester-0\" (UID: \"9f161ea6-77ce-41a8-8a32-92006cadcaa4\") " pod="openshift-logging/logging-loki-ingester-0" Dec 09 15:11:47 crc 
kubenswrapper[4555]: I1209 15:11:47.648962 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/37512310-d9d3-46a8-aba6-7699f4ed2bec-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"37512310-d9d3-46a8-aba6-7699f4ed2bec\") " pod="openshift-logging/logging-loki-compactor-0" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.649513 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37512310-d9d3-46a8-aba6-7699f4ed2bec-config\") pod \"logging-loki-compactor-0\" (UID: \"37512310-d9d3-46a8-aba6-7699f4ed2bec\") " pod="openshift-logging/logging-loki-compactor-0" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.650257 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/37512310-d9d3-46a8-aba6-7699f4ed2bec-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"37512310-d9d3-46a8-aba6-7699f4ed2bec\") " pod="openshift-logging/logging-loki-compactor-0" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.650285 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/37512310-d9d3-46a8-aba6-7699f4ed2bec-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"37512310-d9d3-46a8-aba6-7699f4ed2bec\") " pod="openshift-logging/logging-loki-compactor-0" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.650896 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/9f161ea6-77ce-41a8-8a32-92006cadcaa4-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"9f161ea6-77ce-41a8-8a32-92006cadcaa4\") " pod="openshift-logging/logging-loki-ingester-0" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.650933 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/9f161ea6-77ce-41a8-8a32-92006cadcaa4-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"9f161ea6-77ce-41a8-8a32-92006cadcaa4\") " pod="openshift-logging/logging-loki-ingester-0" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.651197 4555 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.651234 4555 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-3a95b59a-a83c-4a4d-9441-72823bfe7d8f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3a95b59a-a83c-4a4d-9441-72823bfe7d8f\") pod \"logging-loki-compactor-0\" (UID: \"37512310-d9d3-46a8-aba6-7699f4ed2bec\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f39ee08832268af3dacb0397eeaeb0a72522acb0f650486d9d19d03883ec77f7/globalmount\"" pod="openshift-logging/logging-loki-compactor-0" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.651560 4555 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.651589 4555 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-cc7ebe19-96f0-42cb-ab1b-3e46f6f397c4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cc7ebe19-96f0-42cb-ab1b-3e46f6f397c4\") pod \"logging-loki-ingester-0\" (UID: \"9f161ea6-77ce-41a8-8a32-92006cadcaa4\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/cadb5c124ec1ef3b63cd534b18d68a0042f61eb183eda62a259c465f5060a230/globalmount\"" pod="openshift-logging/logging-loki-ingester-0" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.651757 4555 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.651783 4555 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-6a4efe1a-5783-4378-8550-0620cb8949f3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6a4efe1a-5783-4378-8550-0620cb8949f3\") pod \"logging-loki-ingester-0\" (UID: \"9f161ea6-77ce-41a8-8a32-92006cadcaa4\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/884bdd23f7bfbcd44506ef517b30edc91d398d4c4074448d9f4b39877af69794/globalmount\"" pod="openshift-logging/logging-loki-ingester-0" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.652002 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/9f161ea6-77ce-41a8-8a32-92006cadcaa4-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"9f161ea6-77ce-41a8-8a32-92006cadcaa4\") " pod="openshift-logging/logging-loki-ingester-0" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.653599 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/37512310-d9d3-46a8-aba6-7699f4ed2bec-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"37512310-d9d3-46a8-aba6-7699f4ed2bec\") " pod="openshift-logging/logging-loki-compactor-0" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.658971 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c75wx\" (UniqueName: \"kubernetes.io/projected/9f161ea6-77ce-41a8-8a32-92006cadcaa4-kube-api-access-c75wx\") pod \"logging-loki-ingester-0\" (UID: \"9f161ea6-77ce-41a8-8a32-92006cadcaa4\") " pod="openshift-logging/logging-loki-ingester-0" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.661323 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmhkh\" (UniqueName: \"kubernetes.io/projected/37512310-d9d3-46a8-aba6-7699f4ed2bec-kube-api-access-gmhkh\") pod \"logging-loki-compactor-0\" (UID: \"37512310-d9d3-46a8-aba6-7699f4ed2bec\") " pod="openshift-logging/logging-loki-compactor-0" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.670943 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-cc7ebe19-96f0-42cb-ab1b-3e46f6f397c4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cc7ebe19-96f0-42cb-ab1b-3e46f6f397c4\") pod \"logging-loki-ingester-0\" (UID: \"9f161ea6-77ce-41a8-8a32-92006cadcaa4\") " pod="openshift-logging/logging-loki-ingester-0" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.671214 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pvc-6a4efe1a-5783-4378-8550-0620cb8949f3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6a4efe1a-5783-4378-8550-0620cb8949f3\") pod \"logging-loki-ingester-0\" (UID: \"9f161ea6-77ce-41a8-8a32-92006cadcaa4\") " pod="openshift-logging/logging-loki-ingester-0" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.671588 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-3a95b59a-a83c-4a4d-9441-72823bfe7d8f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3a95b59a-a83c-4a4d-9441-72823bfe7d8f\") pod \"logging-loki-compactor-0\" (UID: \"37512310-d9d3-46a8-aba6-7699f4ed2bec\") " pod="openshift-logging/logging-loki-compactor-0" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.702738 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-ingester-0" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.747297 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/a95fb43d-d122-4136-8502-780c693b27c5-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"a95fb43d-d122-4136-8502-780c693b27c5\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.747388 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnj6w\" (UniqueName: \"kubernetes.io/projected/a95fb43d-d122-4136-8502-780c693b27c5-kube-api-access-xnj6w\") pod \"logging-loki-index-gateway-0\" (UID: \"a95fb43d-d122-4136-8502-780c693b27c5\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.747411 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a95fb43d-d122-4136-8502-780c693b27c5-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"a95fb43d-d122-4136-8502-780c693b27c5\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.747430 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/a95fb43d-d122-4136-8502-780c693b27c5-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"a95fb43d-d122-4136-8502-780c693b27c5\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.747466 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/a95fb43d-d122-4136-8502-780c693b27c5-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"a95fb43d-d122-4136-8502-780c693b27c5\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.747508 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a95fb43d-d122-4136-8502-780c693b27c5-config\") pod \"logging-loki-index-gateway-0\" (UID: \"a95fb43d-d122-4136-8502-780c693b27c5\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.747542 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-88954142-0bd5-4bc0-8dc6-b61c877aa246\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-88954142-0bd5-4bc0-8dc6-b61c877aa246\") pod \"logging-loki-index-gateway-0\" (UID: \"a95fb43d-d122-4136-8502-780c693b27c5\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.749275 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-compactor-0" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.749666 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a95fb43d-d122-4136-8502-780c693b27c5-config\") pod \"logging-loki-index-gateway-0\" (UID: \"a95fb43d-d122-4136-8502-780c693b27c5\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.749911 4555 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.749959 4555 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-88954142-0bd5-4bc0-8dc6-b61c877aa246\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-88954142-0bd5-4bc0-8dc6-b61c877aa246\") pod \"logging-loki-index-gateway-0\" (UID: \"a95fb43d-d122-4136-8502-780c693b27c5\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/2f3f56498d8f96d517b675c9284f1f08d014a203d50b4e9d8e17abc2d860dbe2/globalmount\"" pod="openshift-logging/logging-loki-index-gateway-0" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.750492 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a95fb43d-d122-4136-8502-780c693b27c5-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"a95fb43d-d122-4136-8502-780c693b27c5\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.750909 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/a95fb43d-d122-4136-8502-780c693b27c5-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"a95fb43d-d122-4136-8502-780c693b27c5\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.751551 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/a95fb43d-d122-4136-8502-780c693b27c5-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"a95fb43d-d122-4136-8502-780c693b27c5\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.752477 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/a95fb43d-d122-4136-8502-780c693b27c5-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"a95fb43d-d122-4136-8502-780c693b27c5\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.761991 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnj6w\" (UniqueName: \"kubernetes.io/projected/a95fb43d-d122-4136-8502-780c693b27c5-kube-api-access-xnj6w\") pod \"logging-loki-index-gateway-0\" (UID: 
\"a95fb43d-d122-4136-8502-780c693b27c5\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.775369 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-88954142-0bd5-4bc0-8dc6-b61c877aa246\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-88954142-0bd5-4bc0-8dc6-b61c877aa246\") pod \"logging-loki-index-gateway-0\" (UID: \"a95fb43d-d122-4136-8502-780c693b27c5\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 09 15:11:47 crc kubenswrapper[4555]: I1209 15:11:47.800262 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-index-gateway-0" Dec 09 15:11:48 crc kubenswrapper[4555]: I1209 15:11:48.045954 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-ingester-0"] Dec 09 15:11:48 crc kubenswrapper[4555]: W1209 15:11:48.048413 4555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f161ea6_77ce_41a8_8a32_92006cadcaa4.slice/crio-80f0fb4b7b9cc19f0b7e1005ffec9076d58db2485839bea87c523ba4d331b713 WatchSource:0}: Error finding container 80f0fb4b7b9cc19f0b7e1005ffec9076d58db2485839bea87c523ba4d331b713: Status 404 returned error can't find the container with id 80f0fb4b7b9cc19f0b7e1005ffec9076d58db2485839bea87c523ba4d331b713 Dec 09 15:11:48 crc kubenswrapper[4555]: I1209 15:11:48.060542 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-querier-5895d59bb8-9ssm2" event={"ID":"f2672ffe-a69b-4127-95b0-077fff7c2dae","Type":"ContainerStarted","Data":"9f27c07ffadd1284dc7916c44074d9b5ef9a3db1b5a7c992191989cb1afb1997"} Dec 09 15:11:48 crc kubenswrapper[4555]: I1209 15:11:48.061624 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-distributor-76cc67bf56-tfvnb" event={"ID":"84677826-310d-4103-8976-179fea4591d4","Type":"ContainerStarted","Data":"fde802f31f8372f828f5a31ee2a1f0aec733019412a70bcea1c2c11f51d6ef38"} Dec 09 15:11:48 crc kubenswrapper[4555]: I1209 15:11:48.062521 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-9vk9m" event={"ID":"bbbfd9c8-69ae-4dbb-99b1-1a62073759b3","Type":"ContainerStarted","Data":"e8a476526315738586251a98b07f069b0011f45fd31fc9da30bee6ce38c3cf4e"} Dec 09 15:11:48 crc kubenswrapper[4555]: I1209 15:11:48.063551 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-5c557c5777-mvw9s" event={"ID":"9733ac5a-b591-46d7-ab89-bc55453c8330","Type":"ContainerStarted","Data":"ae001552e10e59dbdcbf2208e8ec988cc338cc617c4fddd08d6bd8a9a5104494"} Dec 09 15:11:48 crc kubenswrapper[4555]: I1209 15:11:48.064440 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-ingester-0" event={"ID":"9f161ea6-77ce-41a8-8a32-92006cadcaa4","Type":"ContainerStarted","Data":"80f0fb4b7b9cc19f0b7e1005ffec9076d58db2485839bea87c523ba4d331b713"} Dec 09 15:11:48 crc kubenswrapper[4555]: I1209 15:11:48.065273 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-5c557c5777-cgcbj" event={"ID":"2e9a80cf-bda4-4c6f-be38-028a906afc09","Type":"ContainerStarted","Data":"564488d0e5420757743215a22d444be8591eee8f3d1358670c2cfec9a44cda30"} Dec 09 15:11:48 crc kubenswrapper[4555]: I1209 15:11:48.122631 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-logging/logging-loki-compactor-0"] Dec 09 15:11:48 crc kubenswrapper[4555]: W1209 15:11:48.132889 4555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37512310_d9d3_46a8_aba6_7699f4ed2bec.slice/crio-074cb6fac3816356f3c68c6499870ef0cef75d7bee7751f079b46942a3aa587d WatchSource:0}: Error finding container 074cb6fac3816356f3c68c6499870ef0cef75d7bee7751f079b46942a3aa587d: Status 404 returned error can't find the container with id 074cb6fac3816356f3c68c6499870ef0cef75d7bee7751f079b46942a3aa587d Dec 09 15:11:48 crc kubenswrapper[4555]: I1209 15:11:48.179500 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"] Dec 09 15:11:48 crc kubenswrapper[4555]: W1209 15:11:48.181625 4555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda95fb43d_d122_4136_8502_780c693b27c5.slice/crio-0aea2f45f4632297adcf48851284e3e930ecfc26f22951b8a56483309d41193f WatchSource:0}: Error finding container 0aea2f45f4632297adcf48851284e3e930ecfc26f22951b8a56483309d41193f: Status 404 returned error can't find the container with id 0aea2f45f4632297adcf48851284e3e930ecfc26f22951b8a56483309d41193f Dec 09 15:11:49 crc kubenswrapper[4555]: I1209 15:11:49.131911 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-index-gateway-0" event={"ID":"a95fb43d-d122-4136-8502-780c693b27c5","Type":"ContainerStarted","Data":"0aea2f45f4632297adcf48851284e3e930ecfc26f22951b8a56483309d41193f"} Dec 09 15:11:49 crc kubenswrapper[4555]: I1209 15:11:49.133709 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-compactor-0" event={"ID":"37512310-d9d3-46a8-aba6-7699f4ed2bec","Type":"ContainerStarted","Data":"074cb6fac3816356f3c68c6499870ef0cef75d7bee7751f079b46942a3aa587d"} Dec 09 15:11:51 crc kubenswrapper[4555]: I1209 15:11:51.157354 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-5c557c5777-mvw9s" event={"ID":"9733ac5a-b591-46d7-ab89-bc55453c8330","Type":"ContainerStarted","Data":"f692fb820a9812473b75297ac024d9d7abda96569fa1abbf76a2004a53340e1f"} Dec 09 15:11:51 crc kubenswrapper[4555]: I1209 15:11:51.158976 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-5c557c5777-cgcbj" event={"ID":"2e9a80cf-bda4-4c6f-be38-028a906afc09","Type":"ContainerStarted","Data":"173824fe2f4b31206d27003d901b9b976354c2d93fcd1a3a70dcdf370f8314c2"} Dec 09 15:11:51 crc kubenswrapper[4555]: I1209 15:11:51.160477 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-querier-5895d59bb8-9ssm2" event={"ID":"f2672ffe-a69b-4127-95b0-077fff7c2dae","Type":"ContainerStarted","Data":"048893e909245f9a7dc64bc75a7cf5cf2edf037db83cff1751234ac00fca5d22"} Dec 09 15:11:51 crc kubenswrapper[4555]: I1209 15:11:51.160546 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-querier-5895d59bb8-9ssm2" Dec 09 15:11:51 crc kubenswrapper[4555]: I1209 15:11:51.162883 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-ingester-0" event={"ID":"9f161ea6-77ce-41a8-8a32-92006cadcaa4","Type":"ContainerStarted","Data":"afe90752b8d6e378e487437f7ed11b486beec5c33831548b0adf51e35d25c0d8"} Dec 09 15:11:51 crc kubenswrapper[4555]: I1209 15:11:51.163015 4555 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-ingester-0" Dec 09 15:11:51 crc kubenswrapper[4555]: I1209 15:11:51.164371 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-index-gateway-0" event={"ID":"a95fb43d-d122-4136-8502-780c693b27c5","Type":"ContainerStarted","Data":"782706f903d7247eabe29327526636d31c6c64afe484c6edc301799573d9abdd"} Dec 09 15:11:51 crc kubenswrapper[4555]: I1209 15:11:51.164448 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-index-gateway-0" Dec 09 15:11:51 crc kubenswrapper[4555]: I1209 15:11:51.165609 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-compactor-0" event={"ID":"37512310-d9d3-46a8-aba6-7699f4ed2bec","Type":"ContainerStarted","Data":"687cf0448fd260ac222553c40e81166442a4f4bfcd554ecf0a4c66b10a347baa"} Dec 09 15:11:51 crc kubenswrapper[4555]: I1209 15:11:51.165713 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-compactor-0" Dec 09 15:11:51 crc kubenswrapper[4555]: I1209 15:11:51.166949 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-distributor-76cc67bf56-tfvnb" event={"ID":"84677826-310d-4103-8976-179fea4591d4","Type":"ContainerStarted","Data":"2fa0583ad38611994e06b19898cd575ed0fcc8cca3d68874acc3c9e7b694f821"} Dec 09 15:11:51 crc kubenswrapper[4555]: I1209 15:11:51.167012 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-distributor-76cc67bf56-tfvnb" Dec 09 15:11:51 crc kubenswrapper[4555]: I1209 15:11:51.168187 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-9vk9m" event={"ID":"bbbfd9c8-69ae-4dbb-99b1-1a62073759b3","Type":"ContainerStarted","Data":"6da026af583892a39dd04b2a6c4994a2343e02fdabed73f89dbd95402b824e33"} Dec 09 15:11:51 crc kubenswrapper[4555]: I1209 15:11:51.168338 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-9vk9m" Dec 09 15:11:51 crc kubenswrapper[4555]: I1209 15:11:51.176945 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-querier-5895d59bb8-9ssm2" podStartSLOduration=2.009201961 podStartE2EDuration="5.17691804s" podCreationTimestamp="2025-12-09 15:11:46 +0000 UTC" firstStartedPulling="2025-12-09 15:11:47.279758661 +0000 UTC m=+817.416471369" lastFinishedPulling="2025-12-09 15:11:50.44747473 +0000 UTC m=+820.584187448" observedRunningTime="2025-12-09 15:11:51.171238861 +0000 UTC m=+821.307951579" watchObservedRunningTime="2025-12-09 15:11:51.17691804 +0000 UTC m=+821.313630759" Dec 09 15:11:51 crc kubenswrapper[4555]: I1209 15:11:51.186115 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-9vk9m" podStartSLOduration=1.79942615 podStartE2EDuration="5.186103612s" podCreationTimestamp="2025-12-09 15:11:46 +0000 UTC" firstStartedPulling="2025-12-09 15:11:47.172735222 +0000 UTC m=+817.309447931" lastFinishedPulling="2025-12-09 15:11:50.559412676 +0000 UTC m=+820.696125393" observedRunningTime="2025-12-09 15:11:51.185691284 +0000 UTC m=+821.322404003" watchObservedRunningTime="2025-12-09 15:11:51.186103612 +0000 UTC m=+821.322816329" Dec 09 15:11:51 crc kubenswrapper[4555]: I1209 15:11:51.203290 4555 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-logging/logging-loki-ingester-0" podStartSLOduration=2.819359887 podStartE2EDuration="5.203269111s" podCreationTimestamp="2025-12-09 15:11:46 +0000 UTC" firstStartedPulling="2025-12-09 15:11:48.050383689 +0000 UTC m=+818.187096407" lastFinishedPulling="2025-12-09 15:11:50.434292913 +0000 UTC m=+820.571005631" observedRunningTime="2025-12-09 15:11:51.198883171 +0000 UTC m=+821.335595889" watchObservedRunningTime="2025-12-09 15:11:51.203269111 +0000 UTC m=+821.339981830" Dec 09 15:11:51 crc kubenswrapper[4555]: I1209 15:11:51.215530 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-distributor-76cc67bf56-tfvnb" podStartSLOduration=1.8448580620000001 podStartE2EDuration="5.215513842s" podCreationTimestamp="2025-12-09 15:11:46 +0000 UTC" firstStartedPulling="2025-12-09 15:11:47.075780371 +0000 UTC m=+817.212493089" lastFinishedPulling="2025-12-09 15:11:50.446436152 +0000 UTC m=+820.583148869" observedRunningTime="2025-12-09 15:11:51.211174169 +0000 UTC m=+821.347886887" watchObservedRunningTime="2025-12-09 15:11:51.215513842 +0000 UTC m=+821.352226560" Dec 09 15:11:51 crc kubenswrapper[4555]: I1209 15:11:51.231030 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-index-gateway-0" podStartSLOduration=2.959804069 podStartE2EDuration="5.231010545s" podCreationTimestamp="2025-12-09 15:11:46 +0000 UTC" firstStartedPulling="2025-12-09 15:11:48.183141406 +0000 UTC m=+818.319854123" lastFinishedPulling="2025-12-09 15:11:50.454347881 +0000 UTC m=+820.591060599" observedRunningTime="2025-12-09 15:11:51.225674492 +0000 UTC m=+821.362387210" watchObservedRunningTime="2025-12-09 15:11:51.231010545 +0000 UTC m=+821.367723263" Dec 09 15:11:51 crc kubenswrapper[4555]: I1209 15:11:51.242188 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-compactor-0" podStartSLOduration=2.921832325 podStartE2EDuration="5.242171802s" podCreationTimestamp="2025-12-09 15:11:46 +0000 UTC" firstStartedPulling="2025-12-09 15:11:48.134453763 +0000 UTC m=+818.271166482" lastFinishedPulling="2025-12-09 15:11:50.454793241 +0000 UTC m=+820.591505959" observedRunningTime="2025-12-09 15:11:51.239459978 +0000 UTC m=+821.376172696" watchObservedRunningTime="2025-12-09 15:11:51.242171802 +0000 UTC m=+821.378884520" Dec 09 15:11:54 crc kubenswrapper[4555]: I1209 15:11:54.187187 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-5c557c5777-mvw9s" event={"ID":"9733ac5a-b591-46d7-ab89-bc55453c8330","Type":"ContainerStarted","Data":"ad6d393a29308951619218c9861712b8707d9aa4242b91c769f41d45a35cbfb2"} Dec 09 15:11:54 crc kubenswrapper[4555]: I1209 15:11:54.187717 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-5c557c5777-mvw9s" Dec 09 15:11:54 crc kubenswrapper[4555]: I1209 15:11:54.187730 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-5c557c5777-mvw9s" Dec 09 15:11:54 crc kubenswrapper[4555]: I1209 15:11:54.189891 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-5c557c5777-cgcbj" event={"ID":"2e9a80cf-bda4-4c6f-be38-028a906afc09","Type":"ContainerStarted","Data":"69e306649b66fe0fa220c06b7a3bda65a7945ee260cf6fc07979b457f0ea0953"} Dec 09 15:11:54 crc kubenswrapper[4555]: I1209 15:11:54.190034 4555 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-5c557c5777-cgcbj" Dec 09 15:11:54 crc kubenswrapper[4555]: I1209 15:11:54.190071 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-5c557c5777-cgcbj" Dec 09 15:11:54 crc kubenswrapper[4555]: I1209 15:11:54.193222 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-5c557c5777-mvw9s" Dec 09 15:11:54 crc kubenswrapper[4555]: I1209 15:11:54.197078 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-5c557c5777-cgcbj" Dec 09 15:11:54 crc kubenswrapper[4555]: I1209 15:11:54.197912 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-5c557c5777-cgcbj" Dec 09 15:11:54 crc kubenswrapper[4555]: I1209 15:11:54.197974 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-5c557c5777-mvw9s" Dec 09 15:11:54 crc kubenswrapper[4555]: I1209 15:11:54.203802 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-gateway-5c557c5777-mvw9s" podStartSLOduration=1.842826611 podStartE2EDuration="8.20379236s" podCreationTimestamp="2025-12-09 15:11:46 +0000 UTC" firstStartedPulling="2025-12-09 15:11:47.333950743 +0000 UTC m=+817.470663461" lastFinishedPulling="2025-12-09 15:11:53.694916492 +0000 UTC m=+823.831629210" observedRunningTime="2025-12-09 15:11:54.199882606 +0000 UTC m=+824.336595325" watchObservedRunningTime="2025-12-09 15:11:54.20379236 +0000 UTC m=+824.340505078" Dec 09 15:11:54 crc kubenswrapper[4555]: I1209 15:11:54.213255 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-gateway-5c557c5777-cgcbj" podStartSLOduration=1.912657613 podStartE2EDuration="8.213249032s" podCreationTimestamp="2025-12-09 15:11:46 +0000 UTC" firstStartedPulling="2025-12-09 15:11:47.398219367 +0000 UTC m=+817.534932074" lastFinishedPulling="2025-12-09 15:11:53.698810775 +0000 UTC m=+823.835523493" observedRunningTime="2025-12-09 15:11:54.212843758 +0000 UTC m=+824.349556476" watchObservedRunningTime="2025-12-09 15:11:54.213249032 +0000 UTC m=+824.349961750" Dec 09 15:11:55 crc kubenswrapper[4555]: I1209 15:11:55.896858 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dp95j"] Dec 09 15:11:55 crc kubenswrapper[4555]: I1209 15:11:55.898692 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dp95j" Dec 09 15:11:55 crc kubenswrapper[4555]: I1209 15:11:55.914087 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dp95j"] Dec 09 15:11:55 crc kubenswrapper[4555]: I1209 15:11:55.990846 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d0ce1ba-e894-410b-8ef1-8538e0630aef-catalog-content\") pod \"redhat-marketplace-dp95j\" (UID: \"3d0ce1ba-e894-410b-8ef1-8538e0630aef\") " pod="openshift-marketplace/redhat-marketplace-dp95j" Dec 09 15:11:55 crc kubenswrapper[4555]: I1209 15:11:55.990924 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rwqx\" (UniqueName: \"kubernetes.io/projected/3d0ce1ba-e894-410b-8ef1-8538e0630aef-kube-api-access-7rwqx\") pod \"redhat-marketplace-dp95j\" (UID: \"3d0ce1ba-e894-410b-8ef1-8538e0630aef\") " pod="openshift-marketplace/redhat-marketplace-dp95j" Dec 09 15:11:55 crc kubenswrapper[4555]: I1209 15:11:55.991130 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d0ce1ba-e894-410b-8ef1-8538e0630aef-utilities\") pod \"redhat-marketplace-dp95j\" (UID: \"3d0ce1ba-e894-410b-8ef1-8538e0630aef\") " pod="openshift-marketplace/redhat-marketplace-dp95j" Dec 09 15:11:56 crc kubenswrapper[4555]: I1209 15:11:56.092751 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d0ce1ba-e894-410b-8ef1-8538e0630aef-catalog-content\") pod \"redhat-marketplace-dp95j\" (UID: \"3d0ce1ba-e894-410b-8ef1-8538e0630aef\") " pod="openshift-marketplace/redhat-marketplace-dp95j" Dec 09 15:11:56 crc kubenswrapper[4555]: I1209 15:11:56.092807 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rwqx\" (UniqueName: \"kubernetes.io/projected/3d0ce1ba-e894-410b-8ef1-8538e0630aef-kube-api-access-7rwqx\") pod \"redhat-marketplace-dp95j\" (UID: \"3d0ce1ba-e894-410b-8ef1-8538e0630aef\") " pod="openshift-marketplace/redhat-marketplace-dp95j" Dec 09 15:11:56 crc kubenswrapper[4555]: I1209 15:11:56.092949 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d0ce1ba-e894-410b-8ef1-8538e0630aef-utilities\") pod \"redhat-marketplace-dp95j\" (UID: \"3d0ce1ba-e894-410b-8ef1-8538e0630aef\") " pod="openshift-marketplace/redhat-marketplace-dp95j" Dec 09 15:11:56 crc kubenswrapper[4555]: I1209 15:11:56.093219 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d0ce1ba-e894-410b-8ef1-8538e0630aef-catalog-content\") pod \"redhat-marketplace-dp95j\" (UID: \"3d0ce1ba-e894-410b-8ef1-8538e0630aef\") " pod="openshift-marketplace/redhat-marketplace-dp95j" Dec 09 15:11:56 crc kubenswrapper[4555]: I1209 15:11:56.093386 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d0ce1ba-e894-410b-8ef1-8538e0630aef-utilities\") pod \"redhat-marketplace-dp95j\" (UID: \"3d0ce1ba-e894-410b-8ef1-8538e0630aef\") " pod="openshift-marketplace/redhat-marketplace-dp95j" Dec 09 15:11:56 crc kubenswrapper[4555]: I1209 15:11:56.107801 4555 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-7rwqx\" (UniqueName: \"kubernetes.io/projected/3d0ce1ba-e894-410b-8ef1-8538e0630aef-kube-api-access-7rwqx\") pod \"redhat-marketplace-dp95j\" (UID: \"3d0ce1ba-e894-410b-8ef1-8538e0630aef\") " pod="openshift-marketplace/redhat-marketplace-dp95j" Dec 09 15:11:56 crc kubenswrapper[4555]: I1209 15:11:56.214044 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dp95j" Dec 09 15:11:56 crc kubenswrapper[4555]: I1209 15:11:56.613509 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dp95j"] Dec 09 15:11:56 crc kubenswrapper[4555]: W1209 15:11:56.618011 4555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d0ce1ba_e894_410b_8ef1_8538e0630aef.slice/crio-9e94a8db3544f4ff83417d1881bbd6e5455a3baef81f9832234f74ee614d3e8d WatchSource:0}: Error finding container 9e94a8db3544f4ff83417d1881bbd6e5455a3baef81f9832234f74ee614d3e8d: Status 404 returned error can't find the container with id 9e94a8db3544f4ff83417d1881bbd6e5455a3baef81f9832234f74ee614d3e8d Dec 09 15:11:57 crc kubenswrapper[4555]: I1209 15:11:57.207402 4555 generic.go:334] "Generic (PLEG): container finished" podID="3d0ce1ba-e894-410b-8ef1-8538e0630aef" containerID="8b355f1fc8ed612fe5295957c5714218c617ab051d4c649919e79f0b7d9ad79d" exitCode=0 Dec 09 15:11:57 crc kubenswrapper[4555]: I1209 15:11:57.207508 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dp95j" event={"ID":"3d0ce1ba-e894-410b-8ef1-8538e0630aef","Type":"ContainerDied","Data":"8b355f1fc8ed612fe5295957c5714218c617ab051d4c649919e79f0b7d9ad79d"} Dec 09 15:11:57 crc kubenswrapper[4555]: I1209 15:11:57.207728 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dp95j" event={"ID":"3d0ce1ba-e894-410b-8ef1-8538e0630aef","Type":"ContainerStarted","Data":"9e94a8db3544f4ff83417d1881bbd6e5455a3baef81f9832234f74ee614d3e8d"} Dec 09 15:11:58 crc kubenswrapper[4555]: I1209 15:11:58.215531 4555 generic.go:334] "Generic (PLEG): container finished" podID="3d0ce1ba-e894-410b-8ef1-8538e0630aef" containerID="1516da30894d6d9a0ae3a06d6c6fda06123efe5b307f5a5295e37ce4a5e6452c" exitCode=0 Dec 09 15:11:58 crc kubenswrapper[4555]: I1209 15:11:58.215571 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dp95j" event={"ID":"3d0ce1ba-e894-410b-8ef1-8538e0630aef","Type":"ContainerDied","Data":"1516da30894d6d9a0ae3a06d6c6fda06123efe5b307f5a5295e37ce4a5e6452c"} Dec 09 15:11:59 crc kubenswrapper[4555]: I1209 15:11:59.224087 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dp95j" event={"ID":"3d0ce1ba-e894-410b-8ef1-8538e0630aef","Type":"ContainerStarted","Data":"056a7523f1b12447cc4b484ce4c39a17233db622d21c157eebc936652c8c184d"} Dec 09 15:11:59 crc kubenswrapper[4555]: I1209 15:11:59.243031 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dp95j" podStartSLOduration=2.755207635 podStartE2EDuration="4.243015874s" podCreationTimestamp="2025-12-09 15:11:55 +0000 UTC" firstStartedPulling="2025-12-09 15:11:57.210144919 +0000 UTC m=+827.346857637" lastFinishedPulling="2025-12-09 15:11:58.697953158 +0000 UTC m=+828.834665876" observedRunningTime="2025-12-09 15:11:59.239125218 +0000 UTC m=+829.375837936" 
watchObservedRunningTime="2025-12-09 15:11:59.243015874 +0000 UTC m=+829.379728593" Dec 09 15:12:06 crc kubenswrapper[4555]: I1209 15:12:06.214344 4555 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dp95j" Dec 09 15:12:06 crc kubenswrapper[4555]: I1209 15:12:06.214718 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dp95j" Dec 09 15:12:06 crc kubenswrapper[4555]: I1209 15:12:06.251570 4555 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dp95j" Dec 09 15:12:06 crc kubenswrapper[4555]: I1209 15:12:06.288424 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dp95j" Dec 09 15:12:06 crc kubenswrapper[4555]: I1209 15:12:06.477018 4555 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dp95j"] Dec 09 15:12:06 crc kubenswrapper[4555]: I1209 15:12:06.562895 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-distributor-76cc67bf56-tfvnb" Dec 09 15:12:06 crc kubenswrapper[4555]: I1209 15:12:06.706977 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-querier-5895d59bb8-9ssm2" Dec 09 15:12:06 crc kubenswrapper[4555]: I1209 15:12:06.770226 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-9vk9m" Dec 09 15:12:07 crc kubenswrapper[4555]: I1209 15:12:07.707796 4555 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: this instance owns no tokens Dec 09 15:12:07 crc kubenswrapper[4555]: I1209 15:12:07.707848 4555 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="9f161ea6-77ce-41a8-8a32-92006cadcaa4" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 09 15:12:07 crc kubenswrapper[4555]: I1209 15:12:07.754013 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-compactor-0" Dec 09 15:12:07 crc kubenswrapper[4555]: I1209 15:12:07.806157 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-index-gateway-0" Dec 09 15:12:08 crc kubenswrapper[4555]: I1209 15:12:08.269796 4555 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-dp95j" podUID="3d0ce1ba-e894-410b-8ef1-8538e0630aef" containerName="registry-server" containerID="cri-o://056a7523f1b12447cc4b484ce4c39a17233db622d21c157eebc936652c8c184d" gracePeriod=2 Dec 09 15:12:08 crc kubenswrapper[4555]: I1209 15:12:08.583284 4555 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dp95j" Dec 09 15:12:08 crc kubenswrapper[4555]: I1209 15:12:08.692146 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rwqx\" (UniqueName: \"kubernetes.io/projected/3d0ce1ba-e894-410b-8ef1-8538e0630aef-kube-api-access-7rwqx\") pod \"3d0ce1ba-e894-410b-8ef1-8538e0630aef\" (UID: \"3d0ce1ba-e894-410b-8ef1-8538e0630aef\") " Dec 09 15:12:08 crc kubenswrapper[4555]: I1209 15:12:08.692192 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d0ce1ba-e894-410b-8ef1-8538e0630aef-utilities\") pod \"3d0ce1ba-e894-410b-8ef1-8538e0630aef\" (UID: \"3d0ce1ba-e894-410b-8ef1-8538e0630aef\") " Dec 09 15:12:08 crc kubenswrapper[4555]: I1209 15:12:08.692246 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d0ce1ba-e894-410b-8ef1-8538e0630aef-catalog-content\") pod \"3d0ce1ba-e894-410b-8ef1-8538e0630aef\" (UID: \"3d0ce1ba-e894-410b-8ef1-8538e0630aef\") " Dec 09 15:12:08 crc kubenswrapper[4555]: I1209 15:12:08.693011 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d0ce1ba-e894-410b-8ef1-8538e0630aef-utilities" (OuterVolumeSpecName: "utilities") pod "3d0ce1ba-e894-410b-8ef1-8538e0630aef" (UID: "3d0ce1ba-e894-410b-8ef1-8538e0630aef"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:12:08 crc kubenswrapper[4555]: I1209 15:12:08.697054 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d0ce1ba-e894-410b-8ef1-8538e0630aef-kube-api-access-7rwqx" (OuterVolumeSpecName: "kube-api-access-7rwqx") pod "3d0ce1ba-e894-410b-8ef1-8538e0630aef" (UID: "3d0ce1ba-e894-410b-8ef1-8538e0630aef"). InnerVolumeSpecName "kube-api-access-7rwqx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:12:08 crc kubenswrapper[4555]: I1209 15:12:08.705414 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d0ce1ba-e894-410b-8ef1-8538e0630aef-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3d0ce1ba-e894-410b-8ef1-8538e0630aef" (UID: "3d0ce1ba-e894-410b-8ef1-8538e0630aef"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:12:08 crc kubenswrapper[4555]: I1209 15:12:08.793507 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rwqx\" (UniqueName: \"kubernetes.io/projected/3d0ce1ba-e894-410b-8ef1-8538e0630aef-kube-api-access-7rwqx\") on node \"crc\" DevicePath \"\"" Dec 09 15:12:08 crc kubenswrapper[4555]: I1209 15:12:08.793539 4555 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d0ce1ba-e894-410b-8ef1-8538e0630aef-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 15:12:08 crc kubenswrapper[4555]: I1209 15:12:08.793551 4555 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d0ce1ba-e894-410b-8ef1-8538e0630aef-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 15:12:09 crc kubenswrapper[4555]: I1209 15:12:09.276924 4555 generic.go:334] "Generic (PLEG): container finished" podID="3d0ce1ba-e894-410b-8ef1-8538e0630aef" containerID="056a7523f1b12447cc4b484ce4c39a17233db622d21c157eebc936652c8c184d" exitCode=0 Dec 09 15:12:09 crc kubenswrapper[4555]: I1209 15:12:09.276958 4555 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dp95j" Dec 09 15:12:09 crc kubenswrapper[4555]: I1209 15:12:09.276972 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dp95j" event={"ID":"3d0ce1ba-e894-410b-8ef1-8538e0630aef","Type":"ContainerDied","Data":"056a7523f1b12447cc4b484ce4c39a17233db622d21c157eebc936652c8c184d"} Dec 09 15:12:09 crc kubenswrapper[4555]: I1209 15:12:09.277438 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dp95j" event={"ID":"3d0ce1ba-e894-410b-8ef1-8538e0630aef","Type":"ContainerDied","Data":"9e94a8db3544f4ff83417d1881bbd6e5455a3baef81f9832234f74ee614d3e8d"} Dec 09 15:12:09 crc kubenswrapper[4555]: I1209 15:12:09.277464 4555 scope.go:117] "RemoveContainer" containerID="056a7523f1b12447cc4b484ce4c39a17233db622d21c157eebc936652c8c184d" Dec 09 15:12:09 crc kubenswrapper[4555]: I1209 15:12:09.293558 4555 scope.go:117] "RemoveContainer" containerID="1516da30894d6d9a0ae3a06d6c6fda06123efe5b307f5a5295e37ce4a5e6452c" Dec 09 15:12:09 crc kubenswrapper[4555]: I1209 15:12:09.302036 4555 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dp95j"] Dec 09 15:12:09 crc kubenswrapper[4555]: I1209 15:12:09.306234 4555 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-dp95j"] Dec 09 15:12:09 crc kubenswrapper[4555]: I1209 15:12:09.308168 4555 scope.go:117] "RemoveContainer" containerID="8b355f1fc8ed612fe5295957c5714218c617ab051d4c649919e79f0b7d9ad79d" Dec 09 15:12:09 crc kubenswrapper[4555]: I1209 15:12:09.342905 4555 scope.go:117] "RemoveContainer" containerID="056a7523f1b12447cc4b484ce4c39a17233db622d21c157eebc936652c8c184d" Dec 09 15:12:09 crc kubenswrapper[4555]: E1209 15:12:09.343240 4555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"056a7523f1b12447cc4b484ce4c39a17233db622d21c157eebc936652c8c184d\": container with ID starting with 056a7523f1b12447cc4b484ce4c39a17233db622d21c157eebc936652c8c184d not found: ID does not exist" containerID="056a7523f1b12447cc4b484ce4c39a17233db622d21c157eebc936652c8c184d" Dec 09 15:12:09 crc kubenswrapper[4555]: I1209 15:12:09.343268 4555 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"056a7523f1b12447cc4b484ce4c39a17233db622d21c157eebc936652c8c184d"} err="failed to get container status \"056a7523f1b12447cc4b484ce4c39a17233db622d21c157eebc936652c8c184d\": rpc error: code = NotFound desc = could not find container \"056a7523f1b12447cc4b484ce4c39a17233db622d21c157eebc936652c8c184d\": container with ID starting with 056a7523f1b12447cc4b484ce4c39a17233db622d21c157eebc936652c8c184d not found: ID does not exist" Dec 09 15:12:09 crc kubenswrapper[4555]: I1209 15:12:09.343287 4555 scope.go:117] "RemoveContainer" containerID="1516da30894d6d9a0ae3a06d6c6fda06123efe5b307f5a5295e37ce4a5e6452c" Dec 09 15:12:09 crc kubenswrapper[4555]: E1209 15:12:09.343569 4555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1516da30894d6d9a0ae3a06d6c6fda06123efe5b307f5a5295e37ce4a5e6452c\": container with ID starting with 1516da30894d6d9a0ae3a06d6c6fda06123efe5b307f5a5295e37ce4a5e6452c not found: ID does not exist" containerID="1516da30894d6d9a0ae3a06d6c6fda06123efe5b307f5a5295e37ce4a5e6452c" Dec 09 15:12:09 crc kubenswrapper[4555]: I1209 15:12:09.343601 4555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1516da30894d6d9a0ae3a06d6c6fda06123efe5b307f5a5295e37ce4a5e6452c"} err="failed to get container status \"1516da30894d6d9a0ae3a06d6c6fda06123efe5b307f5a5295e37ce4a5e6452c\": rpc error: code = NotFound desc = could not find container \"1516da30894d6d9a0ae3a06d6c6fda06123efe5b307f5a5295e37ce4a5e6452c\": container with ID starting with 1516da30894d6d9a0ae3a06d6c6fda06123efe5b307f5a5295e37ce4a5e6452c not found: ID does not exist" Dec 09 15:12:09 crc kubenswrapper[4555]: I1209 15:12:09.343622 4555 scope.go:117] "RemoveContainer" containerID="8b355f1fc8ed612fe5295957c5714218c617ab051d4c649919e79f0b7d9ad79d" Dec 09 15:12:09 crc kubenswrapper[4555]: E1209 15:12:09.343893 4555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b355f1fc8ed612fe5295957c5714218c617ab051d4c649919e79f0b7d9ad79d\": container with ID starting with 8b355f1fc8ed612fe5295957c5714218c617ab051d4c649919e79f0b7d9ad79d not found: ID does not exist" containerID="8b355f1fc8ed612fe5295957c5714218c617ab051d4c649919e79f0b7d9ad79d" Dec 09 15:12:09 crc kubenswrapper[4555]: I1209 15:12:09.343916 4555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b355f1fc8ed612fe5295957c5714218c617ab051d4c649919e79f0b7d9ad79d"} err="failed to get container status \"8b355f1fc8ed612fe5295957c5714218c617ab051d4c649919e79f0b7d9ad79d\": rpc error: code = NotFound desc = could not find container \"8b355f1fc8ed612fe5295957c5714218c617ab051d4c649919e79f0b7d9ad79d\": container with ID starting with 8b355f1fc8ed612fe5295957c5714218c617ab051d4c649919e79f0b7d9ad79d not found: ID does not exist" Dec 09 15:12:10 crc kubenswrapper[4555]: I1209 15:12:10.167288 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d0ce1ba-e894-410b-8ef1-8538e0630aef" path="/var/lib/kubelet/pods/3d0ce1ba-e894-410b-8ef1-8538e0630aef/volumes" Dec 09 15:12:17 crc kubenswrapper[4555]: I1209 15:12:17.707406 4555 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: this 
instance owns no tokens Dec 09 15:12:17 crc kubenswrapper[4555]: I1209 15:12:17.707786 4555 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="9f161ea6-77ce-41a8-8a32-92006cadcaa4" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 09 15:12:24 crc kubenswrapper[4555]: I1209 15:12:24.505187 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7qdzk"] Dec 09 15:12:24 crc kubenswrapper[4555]: E1209 15:12:24.505746 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d0ce1ba-e894-410b-8ef1-8538e0630aef" containerName="extract-content" Dec 09 15:12:24 crc kubenswrapper[4555]: I1209 15:12:24.505757 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d0ce1ba-e894-410b-8ef1-8538e0630aef" containerName="extract-content" Dec 09 15:12:24 crc kubenswrapper[4555]: E1209 15:12:24.505773 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d0ce1ba-e894-410b-8ef1-8538e0630aef" containerName="extract-utilities" Dec 09 15:12:24 crc kubenswrapper[4555]: I1209 15:12:24.505778 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d0ce1ba-e894-410b-8ef1-8538e0630aef" containerName="extract-utilities" Dec 09 15:12:24 crc kubenswrapper[4555]: E1209 15:12:24.505790 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d0ce1ba-e894-410b-8ef1-8538e0630aef" containerName="registry-server" Dec 09 15:12:24 crc kubenswrapper[4555]: I1209 15:12:24.505795 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d0ce1ba-e894-410b-8ef1-8538e0630aef" containerName="registry-server" Dec 09 15:12:24 crc kubenswrapper[4555]: I1209 15:12:24.505913 4555 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d0ce1ba-e894-410b-8ef1-8538e0630aef" containerName="registry-server" Dec 09 15:12:24 crc kubenswrapper[4555]: I1209 15:12:24.506714 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7qdzk" Dec 09 15:12:24 crc kubenswrapper[4555]: I1209 15:12:24.511550 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7qdzk"] Dec 09 15:12:24 crc kubenswrapper[4555]: I1209 15:12:24.676441 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27f6e849-7d90-4c6a-8a8a-f5cb0692d86d-utilities\") pod \"certified-operators-7qdzk\" (UID: \"27f6e849-7d90-4c6a-8a8a-f5cb0692d86d\") " pod="openshift-marketplace/certified-operators-7qdzk" Dec 09 15:12:24 crc kubenswrapper[4555]: I1209 15:12:24.676507 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kf4kq\" (UniqueName: \"kubernetes.io/projected/27f6e849-7d90-4c6a-8a8a-f5cb0692d86d-kube-api-access-kf4kq\") pod \"certified-operators-7qdzk\" (UID: \"27f6e849-7d90-4c6a-8a8a-f5cb0692d86d\") " pod="openshift-marketplace/certified-operators-7qdzk" Dec 09 15:12:24 crc kubenswrapper[4555]: I1209 15:12:24.676565 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27f6e849-7d90-4c6a-8a8a-f5cb0692d86d-catalog-content\") pod \"certified-operators-7qdzk\" (UID: \"27f6e849-7d90-4c6a-8a8a-f5cb0692d86d\") " pod="openshift-marketplace/certified-operators-7qdzk" Dec 09 15:12:24 crc kubenswrapper[4555]: I1209 15:12:24.778130 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27f6e849-7d90-4c6a-8a8a-f5cb0692d86d-catalog-content\") pod \"certified-operators-7qdzk\" (UID: \"27f6e849-7d90-4c6a-8a8a-f5cb0692d86d\") " pod="openshift-marketplace/certified-operators-7qdzk" Dec 09 15:12:24 crc kubenswrapper[4555]: I1209 15:12:24.778266 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27f6e849-7d90-4c6a-8a8a-f5cb0692d86d-utilities\") pod \"certified-operators-7qdzk\" (UID: \"27f6e849-7d90-4c6a-8a8a-f5cb0692d86d\") " pod="openshift-marketplace/certified-operators-7qdzk" Dec 09 15:12:24 crc kubenswrapper[4555]: I1209 15:12:24.778320 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kf4kq\" (UniqueName: \"kubernetes.io/projected/27f6e849-7d90-4c6a-8a8a-f5cb0692d86d-kube-api-access-kf4kq\") pod \"certified-operators-7qdzk\" (UID: \"27f6e849-7d90-4c6a-8a8a-f5cb0692d86d\") " pod="openshift-marketplace/certified-operators-7qdzk" Dec 09 15:12:24 crc kubenswrapper[4555]: I1209 15:12:24.778568 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27f6e849-7d90-4c6a-8a8a-f5cb0692d86d-catalog-content\") pod \"certified-operators-7qdzk\" (UID: \"27f6e849-7d90-4c6a-8a8a-f5cb0692d86d\") " pod="openshift-marketplace/certified-operators-7qdzk" Dec 09 15:12:24 crc kubenswrapper[4555]: I1209 15:12:24.778660 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27f6e849-7d90-4c6a-8a8a-f5cb0692d86d-utilities\") pod \"certified-operators-7qdzk\" (UID: \"27f6e849-7d90-4c6a-8a8a-f5cb0692d86d\") " pod="openshift-marketplace/certified-operators-7qdzk" Dec 09 15:12:24 crc kubenswrapper[4555]: I1209 15:12:24.793973 4555 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-kf4kq\" (UniqueName: \"kubernetes.io/projected/27f6e849-7d90-4c6a-8a8a-f5cb0692d86d-kube-api-access-kf4kq\") pod \"certified-operators-7qdzk\" (UID: \"27f6e849-7d90-4c6a-8a8a-f5cb0692d86d\") " pod="openshift-marketplace/certified-operators-7qdzk" Dec 09 15:12:24 crc kubenswrapper[4555]: I1209 15:12:24.827649 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7qdzk" Dec 09 15:12:25 crc kubenswrapper[4555]: I1209 15:12:25.226872 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7qdzk"] Dec 09 15:12:25 crc kubenswrapper[4555]: I1209 15:12:25.353884 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7qdzk" event={"ID":"27f6e849-7d90-4c6a-8a8a-f5cb0692d86d","Type":"ContainerStarted","Data":"29fab9806c12633e5c5eba609c9d11bf5049ab47628ade245dc66b4cf1f405ee"} Dec 09 15:12:25 crc kubenswrapper[4555]: I1209 15:12:25.353921 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7qdzk" event={"ID":"27f6e849-7d90-4c6a-8a8a-f5cb0692d86d","Type":"ContainerStarted","Data":"7ec1aa1869dbf75b174d94847455c31ba3a1b6813abd6242cb3434f62c662777"} Dec 09 15:12:26 crc kubenswrapper[4555]: I1209 15:12:26.360456 4555 generic.go:334] "Generic (PLEG): container finished" podID="27f6e849-7d90-4c6a-8a8a-f5cb0692d86d" containerID="29fab9806c12633e5c5eba609c9d11bf5049ab47628ade245dc66b4cf1f405ee" exitCode=0 Dec 09 15:12:26 crc kubenswrapper[4555]: I1209 15:12:26.360634 4555 generic.go:334] "Generic (PLEG): container finished" podID="27f6e849-7d90-4c6a-8a8a-f5cb0692d86d" containerID="ddc189e1c3d659c263a67035c39b921f8934cbfdb0d0aba5430662c49a699577" exitCode=0 Dec 09 15:12:26 crc kubenswrapper[4555]: I1209 15:12:26.360580 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7qdzk" event={"ID":"27f6e849-7d90-4c6a-8a8a-f5cb0692d86d","Type":"ContainerDied","Data":"29fab9806c12633e5c5eba609c9d11bf5049ab47628ade245dc66b4cf1f405ee"} Dec 09 15:12:26 crc kubenswrapper[4555]: I1209 15:12:26.360668 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7qdzk" event={"ID":"27f6e849-7d90-4c6a-8a8a-f5cb0692d86d","Type":"ContainerDied","Data":"ddc189e1c3d659c263a67035c39b921f8934cbfdb0d0aba5430662c49a699577"} Dec 09 15:12:27 crc kubenswrapper[4555]: I1209 15:12:27.367202 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7qdzk" event={"ID":"27f6e849-7d90-4c6a-8a8a-f5cb0692d86d","Type":"ContainerStarted","Data":"e18f31a1dced0fc90a90379453a8d4a7429e37ffdcd91d0b2df38fe6c8281aff"} Dec 09 15:12:27 crc kubenswrapper[4555]: I1209 15:12:27.380541 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7qdzk" podStartSLOduration=1.843986248 podStartE2EDuration="3.38052931s" podCreationTimestamp="2025-12-09 15:12:24 +0000 UTC" firstStartedPulling="2025-12-09 15:12:25.355419109 +0000 UTC m=+855.492131816" lastFinishedPulling="2025-12-09 15:12:26.891962159 +0000 UTC m=+857.028674878" observedRunningTime="2025-12-09 15:12:27.378033332 +0000 UTC m=+857.514746051" watchObservedRunningTime="2025-12-09 15:12:27.38052931 +0000 UTC m=+857.517242028" Dec 09 15:12:27 crc kubenswrapper[4555]: I1209 15:12:27.707568 4555 patch_prober.go:28] interesting pod/logging-loki-ingester-0 
container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: waiting for 15s after being ready Dec 09 15:12:27 crc kubenswrapper[4555]: I1209 15:12:27.707724 4555 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="9f161ea6-77ce-41a8-8a32-92006cadcaa4" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 09 15:12:30 crc kubenswrapper[4555]: I1209 15:12:30.895946 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pzh79"] Dec 09 15:12:30 crc kubenswrapper[4555]: I1209 15:12:30.897403 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pzh79" Dec 09 15:12:30 crc kubenswrapper[4555]: I1209 15:12:30.904363 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pzh79"] Dec 09 15:12:30 crc kubenswrapper[4555]: I1209 15:12:30.945276 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef88759b-ff68-4af7-8067-0d2cbd27ecc6-utilities\") pod \"community-operators-pzh79\" (UID: \"ef88759b-ff68-4af7-8067-0d2cbd27ecc6\") " pod="openshift-marketplace/community-operators-pzh79" Dec 09 15:12:30 crc kubenswrapper[4555]: I1209 15:12:30.945398 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hgk6\" (UniqueName: \"kubernetes.io/projected/ef88759b-ff68-4af7-8067-0d2cbd27ecc6-kube-api-access-6hgk6\") pod \"community-operators-pzh79\" (UID: \"ef88759b-ff68-4af7-8067-0d2cbd27ecc6\") " pod="openshift-marketplace/community-operators-pzh79" Dec 09 15:12:30 crc kubenswrapper[4555]: I1209 15:12:30.945471 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef88759b-ff68-4af7-8067-0d2cbd27ecc6-catalog-content\") pod \"community-operators-pzh79\" (UID: \"ef88759b-ff68-4af7-8067-0d2cbd27ecc6\") " pod="openshift-marketplace/community-operators-pzh79" Dec 09 15:12:31 crc kubenswrapper[4555]: I1209 15:12:31.046702 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hgk6\" (UniqueName: \"kubernetes.io/projected/ef88759b-ff68-4af7-8067-0d2cbd27ecc6-kube-api-access-6hgk6\") pod \"community-operators-pzh79\" (UID: \"ef88759b-ff68-4af7-8067-0d2cbd27ecc6\") " pod="openshift-marketplace/community-operators-pzh79" Dec 09 15:12:31 crc kubenswrapper[4555]: I1209 15:12:31.046759 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef88759b-ff68-4af7-8067-0d2cbd27ecc6-catalog-content\") pod \"community-operators-pzh79\" (UID: \"ef88759b-ff68-4af7-8067-0d2cbd27ecc6\") " pod="openshift-marketplace/community-operators-pzh79" Dec 09 15:12:31 crc kubenswrapper[4555]: I1209 15:12:31.046858 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef88759b-ff68-4af7-8067-0d2cbd27ecc6-utilities\") pod \"community-operators-pzh79\" (UID: \"ef88759b-ff68-4af7-8067-0d2cbd27ecc6\") " pod="openshift-marketplace/community-operators-pzh79" Dec 09 15:12:31 crc kubenswrapper[4555]: I1209 15:12:31.047123 
4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef88759b-ff68-4af7-8067-0d2cbd27ecc6-catalog-content\") pod \"community-operators-pzh79\" (UID: \"ef88759b-ff68-4af7-8067-0d2cbd27ecc6\") " pod="openshift-marketplace/community-operators-pzh79" Dec 09 15:12:31 crc kubenswrapper[4555]: I1209 15:12:31.047204 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef88759b-ff68-4af7-8067-0d2cbd27ecc6-utilities\") pod \"community-operators-pzh79\" (UID: \"ef88759b-ff68-4af7-8067-0d2cbd27ecc6\") " pod="openshift-marketplace/community-operators-pzh79" Dec 09 15:12:31 crc kubenswrapper[4555]: I1209 15:12:31.063285 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hgk6\" (UniqueName: \"kubernetes.io/projected/ef88759b-ff68-4af7-8067-0d2cbd27ecc6-kube-api-access-6hgk6\") pod \"community-operators-pzh79\" (UID: \"ef88759b-ff68-4af7-8067-0d2cbd27ecc6\") " pod="openshift-marketplace/community-operators-pzh79" Dec 09 15:12:31 crc kubenswrapper[4555]: I1209 15:12:31.215927 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pzh79" Dec 09 15:12:31 crc kubenswrapper[4555]: I1209 15:12:31.614161 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pzh79"] Dec 09 15:12:31 crc kubenswrapper[4555]: W1209 15:12:31.616637 4555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef88759b_ff68_4af7_8067_0d2cbd27ecc6.slice/crio-1a3f825727bcd086c87deab13f7faae1779f2d9e97f6d192c246dcbb7d717d83 WatchSource:0}: Error finding container 1a3f825727bcd086c87deab13f7faae1779f2d9e97f6d192c246dcbb7d717d83: Status 404 returned error can't find the container with id 1a3f825727bcd086c87deab13f7faae1779f2d9e97f6d192c246dcbb7d717d83 Dec 09 15:12:32 crc kubenswrapper[4555]: I1209 15:12:32.393483 4555 generic.go:334] "Generic (PLEG): container finished" podID="ef88759b-ff68-4af7-8067-0d2cbd27ecc6" containerID="6669ececb4bbd1b0a0d11d0aa89382eb7fd96d9c0104591a4dc04b032820593c" exitCode=0 Dec 09 15:12:32 crc kubenswrapper[4555]: I1209 15:12:32.393556 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pzh79" event={"ID":"ef88759b-ff68-4af7-8067-0d2cbd27ecc6","Type":"ContainerDied","Data":"6669ececb4bbd1b0a0d11d0aa89382eb7fd96d9c0104591a4dc04b032820593c"} Dec 09 15:12:32 crc kubenswrapper[4555]: I1209 15:12:32.393698 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pzh79" event={"ID":"ef88759b-ff68-4af7-8067-0d2cbd27ecc6","Type":"ContainerStarted","Data":"1a3f825727bcd086c87deab13f7faae1779f2d9e97f6d192c246dcbb7d717d83"} Dec 09 15:12:33 crc kubenswrapper[4555]: I1209 15:12:33.399547 4555 generic.go:334] "Generic (PLEG): container finished" podID="ef88759b-ff68-4af7-8067-0d2cbd27ecc6" containerID="562aa99333052a6aa858a2ec99d43d5f644c3b774a97f6c23c4bead504201f6f" exitCode=0 Dec 09 15:12:33 crc kubenswrapper[4555]: I1209 15:12:33.399589 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pzh79" event={"ID":"ef88759b-ff68-4af7-8067-0d2cbd27ecc6","Type":"ContainerDied","Data":"562aa99333052a6aa858a2ec99d43d5f644c3b774a97f6c23c4bead504201f6f"} Dec 09 15:12:34 crc kubenswrapper[4555]: I1209 15:12:34.406297 4555 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pzh79" event={"ID":"ef88759b-ff68-4af7-8067-0d2cbd27ecc6","Type":"ContainerStarted","Data":"d90d870338d283edb5f679b2bc5a75307ef251574faa9b0c047a57d17f64a920"} Dec 09 15:12:34 crc kubenswrapper[4555]: I1209 15:12:34.421177 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pzh79" podStartSLOduration=2.926252472 podStartE2EDuration="4.421163176s" podCreationTimestamp="2025-12-09 15:12:30 +0000 UTC" firstStartedPulling="2025-12-09 15:12:32.39515938 +0000 UTC m=+862.531872097" lastFinishedPulling="2025-12-09 15:12:33.890070083 +0000 UTC m=+864.026782801" observedRunningTime="2025-12-09 15:12:34.418955282 +0000 UTC m=+864.555668001" watchObservedRunningTime="2025-12-09 15:12:34.421163176 +0000 UTC m=+864.557875893" Dec 09 15:12:34 crc kubenswrapper[4555]: I1209 15:12:34.828723 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7qdzk" Dec 09 15:12:34 crc kubenswrapper[4555]: I1209 15:12:34.828765 4555 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7qdzk" Dec 09 15:12:34 crc kubenswrapper[4555]: I1209 15:12:34.857347 4555 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7qdzk" Dec 09 15:12:35 crc kubenswrapper[4555]: I1209 15:12:35.437379 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7qdzk" Dec 09 15:12:37 crc kubenswrapper[4555]: I1209 15:12:37.090800 4555 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7qdzk"] Dec 09 15:12:37 crc kubenswrapper[4555]: I1209 15:12:37.422262 4555 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7qdzk" podUID="27f6e849-7d90-4c6a-8a8a-f5cb0692d86d" containerName="registry-server" containerID="cri-o://e18f31a1dced0fc90a90379453a8d4a7429e37ffdcd91d0b2df38fe6c8281aff" gracePeriod=2 Dec 09 15:12:37 crc kubenswrapper[4555]: I1209 15:12:37.707158 4555 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: waiting for 15s after being ready Dec 09 15:12:37 crc kubenswrapper[4555]: I1209 15:12:37.707443 4555 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="9f161ea6-77ce-41a8-8a32-92006cadcaa4" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 09 15:12:37 crc kubenswrapper[4555]: I1209 15:12:37.744356 4555 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7qdzk" Dec 09 15:12:37 crc kubenswrapper[4555]: I1209 15:12:37.829068 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27f6e849-7d90-4c6a-8a8a-f5cb0692d86d-utilities\") pod \"27f6e849-7d90-4c6a-8a8a-f5cb0692d86d\" (UID: \"27f6e849-7d90-4c6a-8a8a-f5cb0692d86d\") " Dec 09 15:12:37 crc kubenswrapper[4555]: I1209 15:12:37.829261 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kf4kq\" (UniqueName: \"kubernetes.io/projected/27f6e849-7d90-4c6a-8a8a-f5cb0692d86d-kube-api-access-kf4kq\") pod \"27f6e849-7d90-4c6a-8a8a-f5cb0692d86d\" (UID: \"27f6e849-7d90-4c6a-8a8a-f5cb0692d86d\") " Dec 09 15:12:37 crc kubenswrapper[4555]: I1209 15:12:37.829548 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27f6e849-7d90-4c6a-8a8a-f5cb0692d86d-catalog-content\") pod \"27f6e849-7d90-4c6a-8a8a-f5cb0692d86d\" (UID: \"27f6e849-7d90-4c6a-8a8a-f5cb0692d86d\") " Dec 09 15:12:37 crc kubenswrapper[4555]: I1209 15:12:37.830065 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27f6e849-7d90-4c6a-8a8a-f5cb0692d86d-utilities" (OuterVolumeSpecName: "utilities") pod "27f6e849-7d90-4c6a-8a8a-f5cb0692d86d" (UID: "27f6e849-7d90-4c6a-8a8a-f5cb0692d86d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:12:37 crc kubenswrapper[4555]: I1209 15:12:37.834247 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27f6e849-7d90-4c6a-8a8a-f5cb0692d86d-kube-api-access-kf4kq" (OuterVolumeSpecName: "kube-api-access-kf4kq") pod "27f6e849-7d90-4c6a-8a8a-f5cb0692d86d" (UID: "27f6e849-7d90-4c6a-8a8a-f5cb0692d86d"). InnerVolumeSpecName "kube-api-access-kf4kq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:12:37 crc kubenswrapper[4555]: I1209 15:12:37.864925 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27f6e849-7d90-4c6a-8a8a-f5cb0692d86d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "27f6e849-7d90-4c6a-8a8a-f5cb0692d86d" (UID: "27f6e849-7d90-4c6a-8a8a-f5cb0692d86d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:12:37 crc kubenswrapper[4555]: I1209 15:12:37.931508 4555 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27f6e849-7d90-4c6a-8a8a-f5cb0692d86d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 15:12:37 crc kubenswrapper[4555]: I1209 15:12:37.931541 4555 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27f6e849-7d90-4c6a-8a8a-f5cb0692d86d-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 15:12:37 crc kubenswrapper[4555]: I1209 15:12:37.931552 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kf4kq\" (UniqueName: \"kubernetes.io/projected/27f6e849-7d90-4c6a-8a8a-f5cb0692d86d-kube-api-access-kf4kq\") on node \"crc\" DevicePath \"\"" Dec 09 15:12:38 crc kubenswrapper[4555]: I1209 15:12:38.427609 4555 generic.go:334] "Generic (PLEG): container finished" podID="27f6e849-7d90-4c6a-8a8a-f5cb0692d86d" containerID="e18f31a1dced0fc90a90379453a8d4a7429e37ffdcd91d0b2df38fe6c8281aff" exitCode=0 Dec 09 15:12:38 crc kubenswrapper[4555]: I1209 15:12:38.427647 4555 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7qdzk" Dec 09 15:12:38 crc kubenswrapper[4555]: I1209 15:12:38.427661 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7qdzk" event={"ID":"27f6e849-7d90-4c6a-8a8a-f5cb0692d86d","Type":"ContainerDied","Data":"e18f31a1dced0fc90a90379453a8d4a7429e37ffdcd91d0b2df38fe6c8281aff"} Dec 09 15:12:38 crc kubenswrapper[4555]: I1209 15:12:38.427692 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7qdzk" event={"ID":"27f6e849-7d90-4c6a-8a8a-f5cb0692d86d","Type":"ContainerDied","Data":"7ec1aa1869dbf75b174d94847455c31ba3a1b6813abd6242cb3434f62c662777"} Dec 09 15:12:38 crc kubenswrapper[4555]: I1209 15:12:38.427707 4555 scope.go:117] "RemoveContainer" containerID="e18f31a1dced0fc90a90379453a8d4a7429e37ffdcd91d0b2df38fe6c8281aff" Dec 09 15:12:38 crc kubenswrapper[4555]: I1209 15:12:38.441942 4555 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7qdzk"] Dec 09 15:12:38 crc kubenswrapper[4555]: I1209 15:12:38.444087 4555 scope.go:117] "RemoveContainer" containerID="ddc189e1c3d659c263a67035c39b921f8934cbfdb0d0aba5430662c49a699577" Dec 09 15:12:38 crc kubenswrapper[4555]: I1209 15:12:38.446497 4555 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7qdzk"] Dec 09 15:12:38 crc kubenswrapper[4555]: I1209 15:12:38.457054 4555 scope.go:117] "RemoveContainer" containerID="29fab9806c12633e5c5eba609c9d11bf5049ab47628ade245dc66b4cf1f405ee" Dec 09 15:12:38 crc kubenswrapper[4555]: I1209 15:12:38.477297 4555 scope.go:117] "RemoveContainer" containerID="e18f31a1dced0fc90a90379453a8d4a7429e37ffdcd91d0b2df38fe6c8281aff" Dec 09 15:12:38 crc kubenswrapper[4555]: E1209 15:12:38.477916 4555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e18f31a1dced0fc90a90379453a8d4a7429e37ffdcd91d0b2df38fe6c8281aff\": container with ID starting with e18f31a1dced0fc90a90379453a8d4a7429e37ffdcd91d0b2df38fe6c8281aff not found: ID does not exist" containerID="e18f31a1dced0fc90a90379453a8d4a7429e37ffdcd91d0b2df38fe6c8281aff" Dec 09 15:12:38 crc kubenswrapper[4555]: I1209 15:12:38.477948 
4555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e18f31a1dced0fc90a90379453a8d4a7429e37ffdcd91d0b2df38fe6c8281aff"} err="failed to get container status \"e18f31a1dced0fc90a90379453a8d4a7429e37ffdcd91d0b2df38fe6c8281aff\": rpc error: code = NotFound desc = could not find container \"e18f31a1dced0fc90a90379453a8d4a7429e37ffdcd91d0b2df38fe6c8281aff\": container with ID starting with e18f31a1dced0fc90a90379453a8d4a7429e37ffdcd91d0b2df38fe6c8281aff not found: ID does not exist" Dec 09 15:12:38 crc kubenswrapper[4555]: I1209 15:12:38.477967 4555 scope.go:117] "RemoveContainer" containerID="ddc189e1c3d659c263a67035c39b921f8934cbfdb0d0aba5430662c49a699577" Dec 09 15:12:38 crc kubenswrapper[4555]: E1209 15:12:38.478275 4555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddc189e1c3d659c263a67035c39b921f8934cbfdb0d0aba5430662c49a699577\": container with ID starting with ddc189e1c3d659c263a67035c39b921f8934cbfdb0d0aba5430662c49a699577 not found: ID does not exist" containerID="ddc189e1c3d659c263a67035c39b921f8934cbfdb0d0aba5430662c49a699577" Dec 09 15:12:38 crc kubenswrapper[4555]: I1209 15:12:38.478345 4555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddc189e1c3d659c263a67035c39b921f8934cbfdb0d0aba5430662c49a699577"} err="failed to get container status \"ddc189e1c3d659c263a67035c39b921f8934cbfdb0d0aba5430662c49a699577\": rpc error: code = NotFound desc = could not find container \"ddc189e1c3d659c263a67035c39b921f8934cbfdb0d0aba5430662c49a699577\": container with ID starting with ddc189e1c3d659c263a67035c39b921f8934cbfdb0d0aba5430662c49a699577 not found: ID does not exist" Dec 09 15:12:38 crc kubenswrapper[4555]: I1209 15:12:38.478378 4555 scope.go:117] "RemoveContainer" containerID="29fab9806c12633e5c5eba609c9d11bf5049ab47628ade245dc66b4cf1f405ee" Dec 09 15:12:38 crc kubenswrapper[4555]: E1209 15:12:38.478837 4555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29fab9806c12633e5c5eba609c9d11bf5049ab47628ade245dc66b4cf1f405ee\": container with ID starting with 29fab9806c12633e5c5eba609c9d11bf5049ab47628ade245dc66b4cf1f405ee not found: ID does not exist" containerID="29fab9806c12633e5c5eba609c9d11bf5049ab47628ade245dc66b4cf1f405ee" Dec 09 15:12:38 crc kubenswrapper[4555]: I1209 15:12:38.478875 4555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29fab9806c12633e5c5eba609c9d11bf5049ab47628ade245dc66b4cf1f405ee"} err="failed to get container status \"29fab9806c12633e5c5eba609c9d11bf5049ab47628ade245dc66b4cf1f405ee\": rpc error: code = NotFound desc = could not find container \"29fab9806c12633e5c5eba609c9d11bf5049ab47628ade245dc66b4cf1f405ee\": container with ID starting with 29fab9806c12633e5c5eba609c9d11bf5049ab47628ade245dc66b4cf1f405ee not found: ID does not exist" Dec 09 15:12:40 crc kubenswrapper[4555]: I1209 15:12:40.167033 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27f6e849-7d90-4c6a-8a8a-f5cb0692d86d" path="/var/lib/kubelet/pods/27f6e849-7d90-4c6a-8a8a-f5cb0692d86d/volumes" Dec 09 15:12:41 crc kubenswrapper[4555]: I1209 15:12:41.216486 4555 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-pzh79" Dec 09 15:12:41 crc kubenswrapper[4555]: I1209 15:12:41.217437 4555 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/community-operators-pzh79" Dec 09 15:12:41 crc kubenswrapper[4555]: I1209 15:12:41.243799 4555 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pzh79" Dec 09 15:12:41 crc kubenswrapper[4555]: I1209 15:12:41.486266 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pzh79" Dec 09 15:12:42 crc kubenswrapper[4555]: I1209 15:12:42.293405 4555 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pzh79"] Dec 09 15:12:43 crc kubenswrapper[4555]: I1209 15:12:43.469016 4555 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-pzh79" podUID="ef88759b-ff68-4af7-8067-0d2cbd27ecc6" containerName="registry-server" containerID="cri-o://d90d870338d283edb5f679b2bc5a75307ef251574faa9b0c047a57d17f64a920" gracePeriod=2 Dec 09 15:12:44 crc kubenswrapper[4555]: I1209 15:12:44.282752 4555 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pzh79" Dec 09 15:12:44 crc kubenswrapper[4555]: I1209 15:12:44.324604 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef88759b-ff68-4af7-8067-0d2cbd27ecc6-utilities\") pod \"ef88759b-ff68-4af7-8067-0d2cbd27ecc6\" (UID: \"ef88759b-ff68-4af7-8067-0d2cbd27ecc6\") " Dec 09 15:12:44 crc kubenswrapper[4555]: I1209 15:12:44.324885 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef88759b-ff68-4af7-8067-0d2cbd27ecc6-catalog-content\") pod \"ef88759b-ff68-4af7-8067-0d2cbd27ecc6\" (UID: \"ef88759b-ff68-4af7-8067-0d2cbd27ecc6\") " Dec 09 15:12:44 crc kubenswrapper[4555]: I1209 15:12:44.325117 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hgk6\" (UniqueName: \"kubernetes.io/projected/ef88759b-ff68-4af7-8067-0d2cbd27ecc6-kube-api-access-6hgk6\") pod \"ef88759b-ff68-4af7-8067-0d2cbd27ecc6\" (UID: \"ef88759b-ff68-4af7-8067-0d2cbd27ecc6\") " Dec 09 15:12:44 crc kubenswrapper[4555]: I1209 15:12:44.325551 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef88759b-ff68-4af7-8067-0d2cbd27ecc6-utilities" (OuterVolumeSpecName: "utilities") pod "ef88759b-ff68-4af7-8067-0d2cbd27ecc6" (UID: "ef88759b-ff68-4af7-8067-0d2cbd27ecc6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:12:44 crc kubenswrapper[4555]: I1209 15:12:44.326140 4555 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef88759b-ff68-4af7-8067-0d2cbd27ecc6-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 15:12:44 crc kubenswrapper[4555]: I1209 15:12:44.330478 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef88759b-ff68-4af7-8067-0d2cbd27ecc6-kube-api-access-6hgk6" (OuterVolumeSpecName: "kube-api-access-6hgk6") pod "ef88759b-ff68-4af7-8067-0d2cbd27ecc6" (UID: "ef88759b-ff68-4af7-8067-0d2cbd27ecc6"). InnerVolumeSpecName "kube-api-access-6hgk6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:12:44 crc kubenswrapper[4555]: I1209 15:12:44.361396 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef88759b-ff68-4af7-8067-0d2cbd27ecc6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ef88759b-ff68-4af7-8067-0d2cbd27ecc6" (UID: "ef88759b-ff68-4af7-8067-0d2cbd27ecc6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:12:44 crc kubenswrapper[4555]: I1209 15:12:44.427282 4555 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef88759b-ff68-4af7-8067-0d2cbd27ecc6-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 15:12:44 crc kubenswrapper[4555]: I1209 15:12:44.427335 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6hgk6\" (UniqueName: \"kubernetes.io/projected/ef88759b-ff68-4af7-8067-0d2cbd27ecc6-kube-api-access-6hgk6\") on node \"crc\" DevicePath \"\"" Dec 09 15:12:44 crc kubenswrapper[4555]: I1209 15:12:44.476035 4555 generic.go:334] "Generic (PLEG): container finished" podID="ef88759b-ff68-4af7-8067-0d2cbd27ecc6" containerID="d90d870338d283edb5f679b2bc5a75307ef251574faa9b0c047a57d17f64a920" exitCode=0 Dec 09 15:12:44 crc kubenswrapper[4555]: I1209 15:12:44.476091 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pzh79" event={"ID":"ef88759b-ff68-4af7-8067-0d2cbd27ecc6","Type":"ContainerDied","Data":"d90d870338d283edb5f679b2bc5a75307ef251574faa9b0c047a57d17f64a920"} Dec 09 15:12:44 crc kubenswrapper[4555]: I1209 15:12:44.476127 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pzh79" event={"ID":"ef88759b-ff68-4af7-8067-0d2cbd27ecc6","Type":"ContainerDied","Data":"1a3f825727bcd086c87deab13f7faae1779f2d9e97f6d192c246dcbb7d717d83"} Dec 09 15:12:44 crc kubenswrapper[4555]: I1209 15:12:44.476146 4555 scope.go:117] "RemoveContainer" containerID="d90d870338d283edb5f679b2bc5a75307ef251574faa9b0c047a57d17f64a920" Dec 09 15:12:44 crc kubenswrapper[4555]: I1209 15:12:44.476207 4555 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pzh79" Dec 09 15:12:44 crc kubenswrapper[4555]: I1209 15:12:44.490335 4555 scope.go:117] "RemoveContainer" containerID="562aa99333052a6aa858a2ec99d43d5f644c3b774a97f6c23c4bead504201f6f" Dec 09 15:12:44 crc kubenswrapper[4555]: I1209 15:12:44.503415 4555 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pzh79"] Dec 09 15:12:44 crc kubenswrapper[4555]: I1209 15:12:44.504329 4555 scope.go:117] "RemoveContainer" containerID="6669ececb4bbd1b0a0d11d0aa89382eb7fd96d9c0104591a4dc04b032820593c" Dec 09 15:12:44 crc kubenswrapper[4555]: I1209 15:12:44.509238 4555 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-pzh79"] Dec 09 15:12:44 crc kubenswrapper[4555]: I1209 15:12:44.535622 4555 scope.go:117] "RemoveContainer" containerID="d90d870338d283edb5f679b2bc5a75307ef251574faa9b0c047a57d17f64a920" Dec 09 15:12:44 crc kubenswrapper[4555]: E1209 15:12:44.536038 4555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d90d870338d283edb5f679b2bc5a75307ef251574faa9b0c047a57d17f64a920\": container with ID starting with d90d870338d283edb5f679b2bc5a75307ef251574faa9b0c047a57d17f64a920 not found: ID does not exist" containerID="d90d870338d283edb5f679b2bc5a75307ef251574faa9b0c047a57d17f64a920" Dec 09 15:12:44 crc kubenswrapper[4555]: I1209 15:12:44.536074 4555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d90d870338d283edb5f679b2bc5a75307ef251574faa9b0c047a57d17f64a920"} err="failed to get container status \"d90d870338d283edb5f679b2bc5a75307ef251574faa9b0c047a57d17f64a920\": rpc error: code = NotFound desc = could not find container \"d90d870338d283edb5f679b2bc5a75307ef251574faa9b0c047a57d17f64a920\": container with ID starting with d90d870338d283edb5f679b2bc5a75307ef251574faa9b0c047a57d17f64a920 not found: ID does not exist" Dec 09 15:12:44 crc kubenswrapper[4555]: I1209 15:12:44.536101 4555 scope.go:117] "RemoveContainer" containerID="562aa99333052a6aa858a2ec99d43d5f644c3b774a97f6c23c4bead504201f6f" Dec 09 15:12:44 crc kubenswrapper[4555]: E1209 15:12:44.536456 4555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"562aa99333052a6aa858a2ec99d43d5f644c3b774a97f6c23c4bead504201f6f\": container with ID starting with 562aa99333052a6aa858a2ec99d43d5f644c3b774a97f6c23c4bead504201f6f not found: ID does not exist" containerID="562aa99333052a6aa858a2ec99d43d5f644c3b774a97f6c23c4bead504201f6f" Dec 09 15:12:44 crc kubenswrapper[4555]: I1209 15:12:44.536509 4555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"562aa99333052a6aa858a2ec99d43d5f644c3b774a97f6c23c4bead504201f6f"} err="failed to get container status \"562aa99333052a6aa858a2ec99d43d5f644c3b774a97f6c23c4bead504201f6f\": rpc error: code = NotFound desc = could not find container \"562aa99333052a6aa858a2ec99d43d5f644c3b774a97f6c23c4bead504201f6f\": container with ID starting with 562aa99333052a6aa858a2ec99d43d5f644c3b774a97f6c23c4bead504201f6f not found: ID does not exist" Dec 09 15:12:44 crc kubenswrapper[4555]: I1209 15:12:44.536530 4555 scope.go:117] "RemoveContainer" containerID="6669ececb4bbd1b0a0d11d0aa89382eb7fd96d9c0104591a4dc04b032820593c" Dec 09 15:12:44 crc kubenswrapper[4555]: E1209 15:12:44.536803 4555 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"6669ececb4bbd1b0a0d11d0aa89382eb7fd96d9c0104591a4dc04b032820593c\": container with ID starting with 6669ececb4bbd1b0a0d11d0aa89382eb7fd96d9c0104591a4dc04b032820593c not found: ID does not exist" containerID="6669ececb4bbd1b0a0d11d0aa89382eb7fd96d9c0104591a4dc04b032820593c" Dec 09 15:12:44 crc kubenswrapper[4555]: I1209 15:12:44.536830 4555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6669ececb4bbd1b0a0d11d0aa89382eb7fd96d9c0104591a4dc04b032820593c"} err="failed to get container status \"6669ececb4bbd1b0a0d11d0aa89382eb7fd96d9c0104591a4dc04b032820593c\": rpc error: code = NotFound desc = could not find container \"6669ececb4bbd1b0a0d11d0aa89382eb7fd96d9c0104591a4dc04b032820593c\": container with ID starting with 6669ececb4bbd1b0a0d11d0aa89382eb7fd96d9c0104591a4dc04b032820593c not found: ID does not exist" Dec 09 15:12:46 crc kubenswrapper[4555]: I1209 15:12:46.173383 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef88759b-ff68-4af7-8067-0d2cbd27ecc6" path="/var/lib/kubelet/pods/ef88759b-ff68-4af7-8067-0d2cbd27ecc6/volumes" Dec 09 15:12:47 crc kubenswrapper[4555]: I1209 15:12:47.706408 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-ingester-0" Dec 09 15:12:55 crc kubenswrapper[4555]: I1209 15:12:55.531853 4555 patch_prober.go:28] interesting pod/machine-config-daemon-x6p6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 15:12:55 crc kubenswrapper[4555]: I1209 15:12:55.532218 4555 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 15:13:06 crc kubenswrapper[4555]: I1209 15:13:06.711470 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/collector-zbppq"] Dec 09 15:13:06 crc kubenswrapper[4555]: E1209 15:13:06.712217 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef88759b-ff68-4af7-8067-0d2cbd27ecc6" containerName="extract-content" Dec 09 15:13:06 crc kubenswrapper[4555]: I1209 15:13:06.712231 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef88759b-ff68-4af7-8067-0d2cbd27ecc6" containerName="extract-content" Dec 09 15:13:06 crc kubenswrapper[4555]: E1209 15:13:06.712247 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27f6e849-7d90-4c6a-8a8a-f5cb0692d86d" containerName="registry-server" Dec 09 15:13:06 crc kubenswrapper[4555]: I1209 15:13:06.712253 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="27f6e849-7d90-4c6a-8a8a-f5cb0692d86d" containerName="registry-server" Dec 09 15:13:06 crc kubenswrapper[4555]: E1209 15:13:06.712263 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef88759b-ff68-4af7-8067-0d2cbd27ecc6" containerName="extract-utilities" Dec 09 15:13:06 crc kubenswrapper[4555]: I1209 15:13:06.712269 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef88759b-ff68-4af7-8067-0d2cbd27ecc6" containerName="extract-utilities" Dec 09 15:13:06 crc kubenswrapper[4555]: E1209 15:13:06.712279 4555 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27f6e849-7d90-4c6a-8a8a-f5cb0692d86d" containerName="extract-content" Dec 09 15:13:06 crc kubenswrapper[4555]: I1209 15:13:06.712284 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="27f6e849-7d90-4c6a-8a8a-f5cb0692d86d" containerName="extract-content" Dec 09 15:13:06 crc kubenswrapper[4555]: E1209 15:13:06.712327 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27f6e849-7d90-4c6a-8a8a-f5cb0692d86d" containerName="extract-utilities" Dec 09 15:13:06 crc kubenswrapper[4555]: I1209 15:13:06.712333 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="27f6e849-7d90-4c6a-8a8a-f5cb0692d86d" containerName="extract-utilities" Dec 09 15:13:06 crc kubenswrapper[4555]: E1209 15:13:06.712355 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef88759b-ff68-4af7-8067-0d2cbd27ecc6" containerName="registry-server" Dec 09 15:13:06 crc kubenswrapper[4555]: I1209 15:13:06.712360 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef88759b-ff68-4af7-8067-0d2cbd27ecc6" containerName="registry-server" Dec 09 15:13:06 crc kubenswrapper[4555]: I1209 15:13:06.713114 4555 memory_manager.go:354] "RemoveStaleState removing state" podUID="27f6e849-7d90-4c6a-8a8a-f5cb0692d86d" containerName="registry-server" Dec 09 15:13:06 crc kubenswrapper[4555]: I1209 15:13:06.713136 4555 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef88759b-ff68-4af7-8067-0d2cbd27ecc6" containerName="registry-server" Dec 09 15:13:06 crc kubenswrapper[4555]: I1209 15:13:06.713820 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-zbppq" Dec 09 15:13:06 crc kubenswrapper[4555]: I1209 15:13:06.719492 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-syslog-receiver" Dec 09 15:13:06 crc kubenswrapper[4555]: I1209 15:13:06.732887 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-dockercfg-r96vd" Dec 09 15:13:06 crc kubenswrapper[4555]: I1209 15:13:06.733195 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-metrics" Dec 09 15:13:06 crc kubenswrapper[4555]: I1209 15:13:06.733441 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-token" Dec 09 15:13:06 crc kubenswrapper[4555]: I1209 15:13:06.733681 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-config" Dec 09 15:13:06 crc kubenswrapper[4555]: I1209 15:13:06.737023 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-trustbundle" Dec 09 15:13:06 crc kubenswrapper[4555]: I1209 15:13:06.740043 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-zbppq"] Dec 09 15:13:06 crc kubenswrapper[4555]: I1209 15:13:06.775801 4555 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-logging/collector-zbppq"] Dec 09 15:13:06 crc kubenswrapper[4555]: E1209 15:13:06.776454 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[collector-syslog-receiver collector-token config config-openshift-service-cacrt datadir entrypoint kube-api-access-zc4p8 metrics sa-token tmp trusted-ca], unattached volumes=[], failed to process volumes=[collector-syslog-receiver collector-token config config-openshift-service-cacrt datadir entrypoint kube-api-access-zc4p8 metrics 
sa-token tmp trusted-ca]: context canceled" pod="openshift-logging/collector-zbppq" podUID="ae6b260f-1e75-4470-a796-bc4f0719136a" Dec 09 15:13:06 crc kubenswrapper[4555]: I1209 15:13:06.821396 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ae6b260f-1e75-4470-a796-bc4f0719136a-tmp\") pod \"collector-zbppq\" (UID: \"ae6b260f-1e75-4470-a796-bc4f0719136a\") " pod="openshift-logging/collector-zbppq" Dec 09 15:13:06 crc kubenswrapper[4555]: I1209 15:13:06.821468 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/ae6b260f-1e75-4470-a796-bc4f0719136a-collector-syslog-receiver\") pod \"collector-zbppq\" (UID: \"ae6b260f-1e75-4470-a796-bc4f0719136a\") " pod="openshift-logging/collector-zbppq" Dec 09 15:13:06 crc kubenswrapper[4555]: I1209 15:13:06.821486 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zc4p8\" (UniqueName: \"kubernetes.io/projected/ae6b260f-1e75-4470-a796-bc4f0719136a-kube-api-access-zc4p8\") pod \"collector-zbppq\" (UID: \"ae6b260f-1e75-4470-a796-bc4f0719136a\") " pod="openshift-logging/collector-zbppq" Dec 09 15:13:06 crc kubenswrapper[4555]: I1209 15:13:06.821531 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/ae6b260f-1e75-4470-a796-bc4f0719136a-metrics\") pod \"collector-zbppq\" (UID: \"ae6b260f-1e75-4470-a796-bc4f0719136a\") " pod="openshift-logging/collector-zbppq" Dec 09 15:13:06 crc kubenswrapper[4555]: I1209 15:13:06.821547 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/ae6b260f-1e75-4470-a796-bc4f0719136a-entrypoint\") pod \"collector-zbppq\" (UID: \"ae6b260f-1e75-4470-a796-bc4f0719136a\") " pod="openshift-logging/collector-zbppq" Dec 09 15:13:06 crc kubenswrapper[4555]: I1209 15:13:06.821570 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/ae6b260f-1e75-4470-a796-bc4f0719136a-config-openshift-service-cacrt\") pod \"collector-zbppq\" (UID: \"ae6b260f-1e75-4470-a796-bc4f0719136a\") " pod="openshift-logging/collector-zbppq" Dec 09 15:13:06 crc kubenswrapper[4555]: I1209 15:13:06.821616 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/ae6b260f-1e75-4470-a796-bc4f0719136a-collector-token\") pod \"collector-zbppq\" (UID: \"ae6b260f-1e75-4470-a796-bc4f0719136a\") " pod="openshift-logging/collector-zbppq" Dec 09 15:13:06 crc kubenswrapper[4555]: I1209 15:13:06.821635 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ae6b260f-1e75-4470-a796-bc4f0719136a-trusted-ca\") pod \"collector-zbppq\" (UID: \"ae6b260f-1e75-4470-a796-bc4f0719136a\") " pod="openshift-logging/collector-zbppq" Dec 09 15:13:06 crc kubenswrapper[4555]: I1209 15:13:06.821652 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae6b260f-1e75-4470-a796-bc4f0719136a-config\") pod \"collector-zbppq\" (UID: 
\"ae6b260f-1e75-4470-a796-bc4f0719136a\") " pod="openshift-logging/collector-zbppq" Dec 09 15:13:06 crc kubenswrapper[4555]: I1209 15:13:06.821676 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/ae6b260f-1e75-4470-a796-bc4f0719136a-sa-token\") pod \"collector-zbppq\" (UID: \"ae6b260f-1e75-4470-a796-bc4f0719136a\") " pod="openshift-logging/collector-zbppq" Dec 09 15:13:06 crc kubenswrapper[4555]: I1209 15:13:06.821698 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/ae6b260f-1e75-4470-a796-bc4f0719136a-datadir\") pod \"collector-zbppq\" (UID: \"ae6b260f-1e75-4470-a796-bc4f0719136a\") " pod="openshift-logging/collector-zbppq" Dec 09 15:13:06 crc kubenswrapper[4555]: I1209 15:13:06.922944 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/ae6b260f-1e75-4470-a796-bc4f0719136a-collector-syslog-receiver\") pod \"collector-zbppq\" (UID: \"ae6b260f-1e75-4470-a796-bc4f0719136a\") " pod="openshift-logging/collector-zbppq" Dec 09 15:13:06 crc kubenswrapper[4555]: I1209 15:13:06.922983 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zc4p8\" (UniqueName: \"kubernetes.io/projected/ae6b260f-1e75-4470-a796-bc4f0719136a-kube-api-access-zc4p8\") pod \"collector-zbppq\" (UID: \"ae6b260f-1e75-4470-a796-bc4f0719136a\") " pod="openshift-logging/collector-zbppq" Dec 09 15:13:06 crc kubenswrapper[4555]: I1209 15:13:06.923038 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/ae6b260f-1e75-4470-a796-bc4f0719136a-metrics\") pod \"collector-zbppq\" (UID: \"ae6b260f-1e75-4470-a796-bc4f0719136a\") " pod="openshift-logging/collector-zbppq" Dec 09 15:13:06 crc kubenswrapper[4555]: I1209 15:13:06.923056 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/ae6b260f-1e75-4470-a796-bc4f0719136a-entrypoint\") pod \"collector-zbppq\" (UID: \"ae6b260f-1e75-4470-a796-bc4f0719136a\") " pod="openshift-logging/collector-zbppq" Dec 09 15:13:06 crc kubenswrapper[4555]: I1209 15:13:06.923079 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/ae6b260f-1e75-4470-a796-bc4f0719136a-config-openshift-service-cacrt\") pod \"collector-zbppq\" (UID: \"ae6b260f-1e75-4470-a796-bc4f0719136a\") " pod="openshift-logging/collector-zbppq" Dec 09 15:13:06 crc kubenswrapper[4555]: E1209 15:13:06.923118 4555 secret.go:188] Couldn't get secret openshift-logging/collector-syslog-receiver: secret "collector-syslog-receiver" not found Dec 09 15:13:06 crc kubenswrapper[4555]: I1209 15:13:06.923134 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/ae6b260f-1e75-4470-a796-bc4f0719136a-collector-token\") pod \"collector-zbppq\" (UID: \"ae6b260f-1e75-4470-a796-bc4f0719136a\") " pod="openshift-logging/collector-zbppq" Dec 09 15:13:06 crc kubenswrapper[4555]: I1209 15:13:06.923153 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ae6b260f-1e75-4470-a796-bc4f0719136a-trusted-ca\") pod 
\"collector-zbppq\" (UID: \"ae6b260f-1e75-4470-a796-bc4f0719136a\") " pod="openshift-logging/collector-zbppq" Dec 09 15:13:06 crc kubenswrapper[4555]: E1209 15:13:06.923176 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae6b260f-1e75-4470-a796-bc4f0719136a-collector-syslog-receiver podName:ae6b260f-1e75-4470-a796-bc4f0719136a nodeName:}" failed. No retries permitted until 2025-12-09 15:13:07.423160084 +0000 UTC m=+897.559872803 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "collector-syslog-receiver" (UniqueName: "kubernetes.io/secret/ae6b260f-1e75-4470-a796-bc4f0719136a-collector-syslog-receiver") pod "collector-zbppq" (UID: "ae6b260f-1e75-4470-a796-bc4f0719136a") : secret "collector-syslog-receiver" not found Dec 09 15:13:06 crc kubenswrapper[4555]: I1209 15:13:06.923198 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae6b260f-1e75-4470-a796-bc4f0719136a-config\") pod \"collector-zbppq\" (UID: \"ae6b260f-1e75-4470-a796-bc4f0719136a\") " pod="openshift-logging/collector-zbppq" Dec 09 15:13:06 crc kubenswrapper[4555]: I1209 15:13:06.923246 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/ae6b260f-1e75-4470-a796-bc4f0719136a-sa-token\") pod \"collector-zbppq\" (UID: \"ae6b260f-1e75-4470-a796-bc4f0719136a\") " pod="openshift-logging/collector-zbppq" Dec 09 15:13:06 crc kubenswrapper[4555]: I1209 15:13:06.923282 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/ae6b260f-1e75-4470-a796-bc4f0719136a-datadir\") pod \"collector-zbppq\" (UID: \"ae6b260f-1e75-4470-a796-bc4f0719136a\") " pod="openshift-logging/collector-zbppq" Dec 09 15:13:06 crc kubenswrapper[4555]: I1209 15:13:06.923340 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ae6b260f-1e75-4470-a796-bc4f0719136a-tmp\") pod \"collector-zbppq\" (UID: \"ae6b260f-1e75-4470-a796-bc4f0719136a\") " pod="openshift-logging/collector-zbppq" Dec 09 15:13:06 crc kubenswrapper[4555]: E1209 15:13:06.923657 4555 secret.go:188] Couldn't get secret openshift-logging/collector-metrics: secret "collector-metrics" not found Dec 09 15:13:06 crc kubenswrapper[4555]: E1209 15:13:06.923706 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae6b260f-1e75-4470-a796-bc4f0719136a-metrics podName:ae6b260f-1e75-4470-a796-bc4f0719136a nodeName:}" failed. No retries permitted until 2025-12-09 15:13:07.423694903 +0000 UTC m=+897.560407620 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics" (UniqueName: "kubernetes.io/secret/ae6b260f-1e75-4470-a796-bc4f0719136a-metrics") pod "collector-zbppq" (UID: "ae6b260f-1e75-4470-a796-bc4f0719136a") : secret "collector-metrics" not found Dec 09 15:13:06 crc kubenswrapper[4555]: I1209 15:13:06.923710 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/ae6b260f-1e75-4470-a796-bc4f0719136a-datadir\") pod \"collector-zbppq\" (UID: \"ae6b260f-1e75-4470-a796-bc4f0719136a\") " pod="openshift-logging/collector-zbppq" Dec 09 15:13:06 crc kubenswrapper[4555]: I1209 15:13:06.924044 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/ae6b260f-1e75-4470-a796-bc4f0719136a-entrypoint\") pod \"collector-zbppq\" (UID: \"ae6b260f-1e75-4470-a796-bc4f0719136a\") " pod="openshift-logging/collector-zbppq" Dec 09 15:13:06 crc kubenswrapper[4555]: I1209 15:13:06.924176 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/ae6b260f-1e75-4470-a796-bc4f0719136a-config-openshift-service-cacrt\") pod \"collector-zbppq\" (UID: \"ae6b260f-1e75-4470-a796-bc4f0719136a\") " pod="openshift-logging/collector-zbppq" Dec 09 15:13:06 crc kubenswrapper[4555]: I1209 15:13:06.924498 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ae6b260f-1e75-4470-a796-bc4f0719136a-trusted-ca\") pod \"collector-zbppq\" (UID: \"ae6b260f-1e75-4470-a796-bc4f0719136a\") " pod="openshift-logging/collector-zbppq" Dec 09 15:13:06 crc kubenswrapper[4555]: I1209 15:13:06.924622 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae6b260f-1e75-4470-a796-bc4f0719136a-config\") pod \"collector-zbppq\" (UID: \"ae6b260f-1e75-4470-a796-bc4f0719136a\") " pod="openshift-logging/collector-zbppq" Dec 09 15:13:06 crc kubenswrapper[4555]: I1209 15:13:06.933499 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ae6b260f-1e75-4470-a796-bc4f0719136a-tmp\") pod \"collector-zbppq\" (UID: \"ae6b260f-1e75-4470-a796-bc4f0719136a\") " pod="openshift-logging/collector-zbppq" Dec 09 15:13:06 crc kubenswrapper[4555]: I1209 15:13:06.933904 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/ae6b260f-1e75-4470-a796-bc4f0719136a-collector-token\") pod \"collector-zbppq\" (UID: \"ae6b260f-1e75-4470-a796-bc4f0719136a\") " pod="openshift-logging/collector-zbppq" Dec 09 15:13:06 crc kubenswrapper[4555]: I1209 15:13:06.938184 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/ae6b260f-1e75-4470-a796-bc4f0719136a-sa-token\") pod \"collector-zbppq\" (UID: \"ae6b260f-1e75-4470-a796-bc4f0719136a\") " pod="openshift-logging/collector-zbppq" Dec 09 15:13:06 crc kubenswrapper[4555]: I1209 15:13:06.938971 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zc4p8\" (UniqueName: \"kubernetes.io/projected/ae6b260f-1e75-4470-a796-bc4f0719136a-kube-api-access-zc4p8\") pod \"collector-zbppq\" (UID: \"ae6b260f-1e75-4470-a796-bc4f0719136a\") " pod="openshift-logging/collector-zbppq" Dec 09 15:13:07 crc kubenswrapper[4555]: I1209 15:13:07.429638 4555 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/ae6b260f-1e75-4470-a796-bc4f0719136a-collector-syslog-receiver\") pod \"collector-zbppq\" (UID: \"ae6b260f-1e75-4470-a796-bc4f0719136a\") " pod="openshift-logging/collector-zbppq" Dec 09 15:13:07 crc kubenswrapper[4555]: I1209 15:13:07.429721 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/ae6b260f-1e75-4470-a796-bc4f0719136a-metrics\") pod \"collector-zbppq\" (UID: \"ae6b260f-1e75-4470-a796-bc4f0719136a\") " pod="openshift-logging/collector-zbppq" Dec 09 15:13:07 crc kubenswrapper[4555]: I1209 15:13:07.432200 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/ae6b260f-1e75-4470-a796-bc4f0719136a-collector-syslog-receiver\") pod \"collector-zbppq\" (UID: \"ae6b260f-1e75-4470-a796-bc4f0719136a\") " pod="openshift-logging/collector-zbppq" Dec 09 15:13:07 crc kubenswrapper[4555]: I1209 15:13:07.432627 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/ae6b260f-1e75-4470-a796-bc4f0719136a-metrics\") pod \"collector-zbppq\" (UID: \"ae6b260f-1e75-4470-a796-bc4f0719136a\") " pod="openshift-logging/collector-zbppq" Dec 09 15:13:07 crc kubenswrapper[4555]: I1209 15:13:07.586741 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-zbppq" Dec 09 15:13:07 crc kubenswrapper[4555]: I1209 15:13:07.593996 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-zbppq" Dec 09 15:13:07 crc kubenswrapper[4555]: I1209 15:13:07.631602 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/ae6b260f-1e75-4470-a796-bc4f0719136a-config-openshift-service-cacrt\") pod \"ae6b260f-1e75-4470-a796-bc4f0719136a\" (UID: \"ae6b260f-1e75-4470-a796-bc4f0719136a\") " Dec 09 15:13:07 crc kubenswrapper[4555]: I1209 15:13:07.631639 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/ae6b260f-1e75-4470-a796-bc4f0719136a-datadir\") pod \"ae6b260f-1e75-4470-a796-bc4f0719136a\" (UID: \"ae6b260f-1e75-4470-a796-bc4f0719136a\") " Dec 09 15:13:07 crc kubenswrapper[4555]: I1209 15:13:07.631662 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae6b260f-1e75-4470-a796-bc4f0719136a-config\") pod \"ae6b260f-1e75-4470-a796-bc4f0719136a\" (UID: \"ae6b260f-1e75-4470-a796-bc4f0719136a\") " Dec 09 15:13:07 crc kubenswrapper[4555]: I1209 15:13:07.631684 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ae6b260f-1e75-4470-a796-bc4f0719136a-trusted-ca\") pod \"ae6b260f-1e75-4470-a796-bc4f0719136a\" (UID: \"ae6b260f-1e75-4470-a796-bc4f0719136a\") " Dec 09 15:13:07 crc kubenswrapper[4555]: I1209 15:13:07.631703 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ae6b260f-1e75-4470-a796-bc4f0719136a-tmp\") pod \"ae6b260f-1e75-4470-a796-bc4f0719136a\" (UID: \"ae6b260f-1e75-4470-a796-bc4f0719136a\") " Dec 09 15:13:07 crc kubenswrapper[4555]: I1209 
15:13:07.631715 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ae6b260f-1e75-4470-a796-bc4f0719136a-datadir" (OuterVolumeSpecName: "datadir") pod "ae6b260f-1e75-4470-a796-bc4f0719136a" (UID: "ae6b260f-1e75-4470-a796-bc4f0719136a"). InnerVolumeSpecName "datadir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 15:13:07 crc kubenswrapper[4555]: I1209 15:13:07.631755 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/ae6b260f-1e75-4470-a796-bc4f0719136a-collector-token\") pod \"ae6b260f-1e75-4470-a796-bc4f0719136a\" (UID: \"ae6b260f-1e75-4470-a796-bc4f0719136a\") " Dec 09 15:13:07 crc kubenswrapper[4555]: I1209 15:13:07.631775 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/ae6b260f-1e75-4470-a796-bc4f0719136a-sa-token\") pod \"ae6b260f-1e75-4470-a796-bc4f0719136a\" (UID: \"ae6b260f-1e75-4470-a796-bc4f0719136a\") " Dec 09 15:13:07 crc kubenswrapper[4555]: I1209 15:13:07.632008 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae6b260f-1e75-4470-a796-bc4f0719136a-config-openshift-service-cacrt" (OuterVolumeSpecName: "config-openshift-service-cacrt") pod "ae6b260f-1e75-4470-a796-bc4f0719136a" (UID: "ae6b260f-1e75-4470-a796-bc4f0719136a"). InnerVolumeSpecName "config-openshift-service-cacrt". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 15:13:07 crc kubenswrapper[4555]: I1209 15:13:07.632036 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae6b260f-1e75-4470-a796-bc4f0719136a-config" (OuterVolumeSpecName: "config") pod "ae6b260f-1e75-4470-a796-bc4f0719136a" (UID: "ae6b260f-1e75-4470-a796-bc4f0719136a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 15:13:07 crc kubenswrapper[4555]: I1209 15:13:07.632121 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/ae6b260f-1e75-4470-a796-bc4f0719136a-collector-syslog-receiver\") pod \"ae6b260f-1e75-4470-a796-bc4f0719136a\" (UID: \"ae6b260f-1e75-4470-a796-bc4f0719136a\") " Dec 09 15:13:07 crc kubenswrapper[4555]: I1209 15:13:07.632201 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zc4p8\" (UniqueName: \"kubernetes.io/projected/ae6b260f-1e75-4470-a796-bc4f0719136a-kube-api-access-zc4p8\") pod \"ae6b260f-1e75-4470-a796-bc4f0719136a\" (UID: \"ae6b260f-1e75-4470-a796-bc4f0719136a\") " Dec 09 15:13:07 crc kubenswrapper[4555]: I1209 15:13:07.632286 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae6b260f-1e75-4470-a796-bc4f0719136a-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "ae6b260f-1e75-4470-a796-bc4f0719136a" (UID: "ae6b260f-1e75-4470-a796-bc4f0719136a"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 15:13:07 crc kubenswrapper[4555]: I1209 15:13:07.632322 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/ae6b260f-1e75-4470-a796-bc4f0719136a-metrics\") pod \"ae6b260f-1e75-4470-a796-bc4f0719136a\" (UID: \"ae6b260f-1e75-4470-a796-bc4f0719136a\") " Dec 09 15:13:07 crc kubenswrapper[4555]: I1209 15:13:07.632452 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/ae6b260f-1e75-4470-a796-bc4f0719136a-entrypoint\") pod \"ae6b260f-1e75-4470-a796-bc4f0719136a\" (UID: \"ae6b260f-1e75-4470-a796-bc4f0719136a\") " Dec 09 15:13:07 crc kubenswrapper[4555]: I1209 15:13:07.632866 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae6b260f-1e75-4470-a796-bc4f0719136a-entrypoint" (OuterVolumeSpecName: "entrypoint") pod "ae6b260f-1e75-4470-a796-bc4f0719136a" (UID: "ae6b260f-1e75-4470-a796-bc4f0719136a"). InnerVolumeSpecName "entrypoint". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 15:13:07 crc kubenswrapper[4555]: I1209 15:13:07.633020 4555 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ae6b260f-1e75-4470-a796-bc4f0719136a-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 09 15:13:07 crc kubenswrapper[4555]: I1209 15:13:07.633039 4555 reconciler_common.go:293] "Volume detached for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/ae6b260f-1e75-4470-a796-bc4f0719136a-entrypoint\") on node \"crc\" DevicePath \"\"" Dec 09 15:13:07 crc kubenswrapper[4555]: I1209 15:13:07.633048 4555 reconciler_common.go:293] "Volume detached for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/ae6b260f-1e75-4470-a796-bc4f0719136a-config-openshift-service-cacrt\") on node \"crc\" DevicePath \"\"" Dec 09 15:13:07 crc kubenswrapper[4555]: I1209 15:13:07.633060 4555 reconciler_common.go:293] "Volume detached for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/ae6b260f-1e75-4470-a796-bc4f0719136a-datadir\") on node \"crc\" DevicePath \"\"" Dec 09 15:13:07 crc kubenswrapper[4555]: I1209 15:13:07.633068 4555 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae6b260f-1e75-4470-a796-bc4f0719136a-config\") on node \"crc\" DevicePath \"\"" Dec 09 15:13:07 crc kubenswrapper[4555]: I1209 15:13:07.634128 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae6b260f-1e75-4470-a796-bc4f0719136a-sa-token" (OuterVolumeSpecName: "sa-token") pod "ae6b260f-1e75-4470-a796-bc4f0719136a" (UID: "ae6b260f-1e75-4470-a796-bc4f0719136a"). InnerVolumeSpecName "sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:13:07 crc kubenswrapper[4555]: I1209 15:13:07.634455 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae6b260f-1e75-4470-a796-bc4f0719136a-collector-token" (OuterVolumeSpecName: "collector-token") pod "ae6b260f-1e75-4470-a796-bc4f0719136a" (UID: "ae6b260f-1e75-4470-a796-bc4f0719136a"). InnerVolumeSpecName "collector-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 15:13:07 crc kubenswrapper[4555]: I1209 15:13:07.634819 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae6b260f-1e75-4470-a796-bc4f0719136a-metrics" (OuterVolumeSpecName: "metrics") pod "ae6b260f-1e75-4470-a796-bc4f0719136a" (UID: "ae6b260f-1e75-4470-a796-bc4f0719136a"). InnerVolumeSpecName "metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 15:13:07 crc kubenswrapper[4555]: I1209 15:13:07.635266 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae6b260f-1e75-4470-a796-bc4f0719136a-tmp" (OuterVolumeSpecName: "tmp") pod "ae6b260f-1e75-4470-a796-bc4f0719136a" (UID: "ae6b260f-1e75-4470-a796-bc4f0719136a"). InnerVolumeSpecName "tmp". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:13:07 crc kubenswrapper[4555]: I1209 15:13:07.635284 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae6b260f-1e75-4470-a796-bc4f0719136a-kube-api-access-zc4p8" (OuterVolumeSpecName: "kube-api-access-zc4p8") pod "ae6b260f-1e75-4470-a796-bc4f0719136a" (UID: "ae6b260f-1e75-4470-a796-bc4f0719136a"). InnerVolumeSpecName "kube-api-access-zc4p8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:13:07 crc kubenswrapper[4555]: I1209 15:13:07.636158 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae6b260f-1e75-4470-a796-bc4f0719136a-collector-syslog-receiver" (OuterVolumeSpecName: "collector-syslog-receiver") pod "ae6b260f-1e75-4470-a796-bc4f0719136a" (UID: "ae6b260f-1e75-4470-a796-bc4f0719136a"). InnerVolumeSpecName "collector-syslog-receiver". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 15:13:07 crc kubenswrapper[4555]: I1209 15:13:07.733955 4555 reconciler_common.go:293] "Volume detached for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/ae6b260f-1e75-4470-a796-bc4f0719136a-collector-token\") on node \"crc\" DevicePath \"\"" Dec 09 15:13:07 crc kubenswrapper[4555]: I1209 15:13:07.733981 4555 reconciler_common.go:293] "Volume detached for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/ae6b260f-1e75-4470-a796-bc4f0719136a-sa-token\") on node \"crc\" DevicePath \"\"" Dec 09 15:13:07 crc kubenswrapper[4555]: I1209 15:13:07.733990 4555 reconciler_common.go:293] "Volume detached for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/ae6b260f-1e75-4470-a796-bc4f0719136a-collector-syslog-receiver\") on node \"crc\" DevicePath \"\"" Dec 09 15:13:07 crc kubenswrapper[4555]: I1209 15:13:07.734000 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zc4p8\" (UniqueName: \"kubernetes.io/projected/ae6b260f-1e75-4470-a796-bc4f0719136a-kube-api-access-zc4p8\") on node \"crc\" DevicePath \"\"" Dec 09 15:13:07 crc kubenswrapper[4555]: I1209 15:13:07.734008 4555 reconciler_common.go:293] "Volume detached for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/ae6b260f-1e75-4470-a796-bc4f0719136a-metrics\") on node \"crc\" DevicePath \"\"" Dec 09 15:13:07 crc kubenswrapper[4555]: I1209 15:13:07.734016 4555 reconciler_common.go:293] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ae6b260f-1e75-4470-a796-bc4f0719136a-tmp\") on node \"crc\" DevicePath \"\"" Dec 09 15:13:08 crc kubenswrapper[4555]: I1209 15:13:08.591157 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/collector-zbppq" Dec 09 15:13:08 crc kubenswrapper[4555]: I1209 15:13:08.617330 4555 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-logging/collector-zbppq"] Dec 09 15:13:08 crc kubenswrapper[4555]: I1209 15:13:08.622068 4555 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-logging/collector-zbppq"] Dec 09 15:13:08 crc kubenswrapper[4555]: I1209 15:13:08.625373 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/collector-qt8j8"] Dec 09 15:13:08 crc kubenswrapper[4555]: I1209 15:13:08.626162 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-qt8j8" Dec 09 15:13:08 crc kubenswrapper[4555]: I1209 15:13:08.627801 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-metrics" Dec 09 15:13:08 crc kubenswrapper[4555]: I1209 15:13:08.627988 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-token" Dec 09 15:13:08 crc kubenswrapper[4555]: I1209 15:13:08.628538 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-dockercfg-r96vd" Dec 09 15:13:08 crc kubenswrapper[4555]: I1209 15:13:08.629016 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-syslog-receiver" Dec 09 15:13:08 crc kubenswrapper[4555]: I1209 15:13:08.629922 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-config" Dec 09 15:13:08 crc kubenswrapper[4555]: I1209 15:13:08.633762 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-trustbundle" Dec 09 15:13:08 crc kubenswrapper[4555]: I1209 15:13:08.634825 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-qt8j8"] Dec 09 15:13:08 crc kubenswrapper[4555]: I1209 15:13:08.645609 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/8dcb0d25-363c-47cb-9173-49dad4a2fe5d-entrypoint\") pod \"collector-qt8j8\" (UID: \"8dcb0d25-363c-47cb-9173-49dad4a2fe5d\") " pod="openshift-logging/collector-qt8j8" Dec 09 15:13:08 crc kubenswrapper[4555]: I1209 15:13:08.645735 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8dcb0d25-363c-47cb-9173-49dad4a2fe5d-trusted-ca\") pod \"collector-qt8j8\" (UID: \"8dcb0d25-363c-47cb-9173-49dad4a2fe5d\") " pod="openshift-logging/collector-qt8j8" Dec 09 15:13:08 crc kubenswrapper[4555]: I1209 15:13:08.645790 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8dcb0d25-363c-47cb-9173-49dad4a2fe5d-config\") pod \"collector-qt8j8\" (UID: \"8dcb0d25-363c-47cb-9173-49dad4a2fe5d\") " pod="openshift-logging/collector-qt8j8" Dec 09 15:13:08 crc kubenswrapper[4555]: I1209 15:13:08.645842 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/8dcb0d25-363c-47cb-9173-49dad4a2fe5d-datadir\") pod \"collector-qt8j8\" (UID: \"8dcb0d25-363c-47cb-9173-49dad4a2fe5d\") " pod="openshift-logging/collector-qt8j8" Dec 09 15:13:08 crc kubenswrapper[4555]: I1209 15:13:08.645964 4555 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/8dcb0d25-363c-47cb-9173-49dad4a2fe5d-collector-token\") pod \"collector-qt8j8\" (UID: \"8dcb0d25-363c-47cb-9173-49dad4a2fe5d\") " pod="openshift-logging/collector-qt8j8" Dec 09 15:13:08 crc kubenswrapper[4555]: I1209 15:13:08.646002 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/8dcb0d25-363c-47cb-9173-49dad4a2fe5d-collector-syslog-receiver\") pod \"collector-qt8j8\" (UID: \"8dcb0d25-363c-47cb-9173-49dad4a2fe5d\") " pod="openshift-logging/collector-qt8j8" Dec 09 15:13:08 crc kubenswrapper[4555]: I1209 15:13:08.646043 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xr48\" (UniqueName: \"kubernetes.io/projected/8dcb0d25-363c-47cb-9173-49dad4a2fe5d-kube-api-access-8xr48\") pod \"collector-qt8j8\" (UID: \"8dcb0d25-363c-47cb-9173-49dad4a2fe5d\") " pod="openshift-logging/collector-qt8j8" Dec 09 15:13:08 crc kubenswrapper[4555]: I1209 15:13:08.646073 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8dcb0d25-363c-47cb-9173-49dad4a2fe5d-tmp\") pod \"collector-qt8j8\" (UID: \"8dcb0d25-363c-47cb-9173-49dad4a2fe5d\") " pod="openshift-logging/collector-qt8j8" Dec 09 15:13:08 crc kubenswrapper[4555]: I1209 15:13:08.646105 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/8dcb0d25-363c-47cb-9173-49dad4a2fe5d-config-openshift-service-cacrt\") pod \"collector-qt8j8\" (UID: \"8dcb0d25-363c-47cb-9173-49dad4a2fe5d\") " pod="openshift-logging/collector-qt8j8" Dec 09 15:13:08 crc kubenswrapper[4555]: I1209 15:13:08.646154 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/8dcb0d25-363c-47cb-9173-49dad4a2fe5d-sa-token\") pod \"collector-qt8j8\" (UID: \"8dcb0d25-363c-47cb-9173-49dad4a2fe5d\") " pod="openshift-logging/collector-qt8j8" Dec 09 15:13:08 crc kubenswrapper[4555]: I1209 15:13:08.646194 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/8dcb0d25-363c-47cb-9173-49dad4a2fe5d-metrics\") pod \"collector-qt8j8\" (UID: \"8dcb0d25-363c-47cb-9173-49dad4a2fe5d\") " pod="openshift-logging/collector-qt8j8" Dec 09 15:13:08 crc kubenswrapper[4555]: I1209 15:13:08.747543 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8dcb0d25-363c-47cb-9173-49dad4a2fe5d-trusted-ca\") pod \"collector-qt8j8\" (UID: \"8dcb0d25-363c-47cb-9173-49dad4a2fe5d\") " pod="openshift-logging/collector-qt8j8" Dec 09 15:13:08 crc kubenswrapper[4555]: I1209 15:13:08.747580 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8dcb0d25-363c-47cb-9173-49dad4a2fe5d-config\") pod \"collector-qt8j8\" (UID: \"8dcb0d25-363c-47cb-9173-49dad4a2fe5d\") " pod="openshift-logging/collector-qt8j8" Dec 09 15:13:08 crc kubenswrapper[4555]: I1209 15:13:08.747604 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"datadir\" (UniqueName: 
\"kubernetes.io/host-path/8dcb0d25-363c-47cb-9173-49dad4a2fe5d-datadir\") pod \"collector-qt8j8\" (UID: \"8dcb0d25-363c-47cb-9173-49dad4a2fe5d\") " pod="openshift-logging/collector-qt8j8" Dec 09 15:13:08 crc kubenswrapper[4555]: I1209 15:13:08.747655 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/8dcb0d25-363c-47cb-9173-49dad4a2fe5d-collector-token\") pod \"collector-qt8j8\" (UID: \"8dcb0d25-363c-47cb-9173-49dad4a2fe5d\") " pod="openshift-logging/collector-qt8j8" Dec 09 15:13:08 crc kubenswrapper[4555]: I1209 15:13:08.747680 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/8dcb0d25-363c-47cb-9173-49dad4a2fe5d-collector-syslog-receiver\") pod \"collector-qt8j8\" (UID: \"8dcb0d25-363c-47cb-9173-49dad4a2fe5d\") " pod="openshift-logging/collector-qt8j8" Dec 09 15:13:08 crc kubenswrapper[4555]: I1209 15:13:08.747701 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xr48\" (UniqueName: \"kubernetes.io/projected/8dcb0d25-363c-47cb-9173-49dad4a2fe5d-kube-api-access-8xr48\") pod \"collector-qt8j8\" (UID: \"8dcb0d25-363c-47cb-9173-49dad4a2fe5d\") " pod="openshift-logging/collector-qt8j8" Dec 09 15:13:08 crc kubenswrapper[4555]: I1209 15:13:08.747706 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/8dcb0d25-363c-47cb-9173-49dad4a2fe5d-datadir\") pod \"collector-qt8j8\" (UID: \"8dcb0d25-363c-47cb-9173-49dad4a2fe5d\") " pod="openshift-logging/collector-qt8j8" Dec 09 15:13:08 crc kubenswrapper[4555]: I1209 15:13:08.747723 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8dcb0d25-363c-47cb-9173-49dad4a2fe5d-tmp\") pod \"collector-qt8j8\" (UID: \"8dcb0d25-363c-47cb-9173-49dad4a2fe5d\") " pod="openshift-logging/collector-qt8j8" Dec 09 15:13:08 crc kubenswrapper[4555]: I1209 15:13:08.747746 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/8dcb0d25-363c-47cb-9173-49dad4a2fe5d-config-openshift-service-cacrt\") pod \"collector-qt8j8\" (UID: \"8dcb0d25-363c-47cb-9173-49dad4a2fe5d\") " pod="openshift-logging/collector-qt8j8" Dec 09 15:13:08 crc kubenswrapper[4555]: I1209 15:13:08.747770 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/8dcb0d25-363c-47cb-9173-49dad4a2fe5d-sa-token\") pod \"collector-qt8j8\" (UID: \"8dcb0d25-363c-47cb-9173-49dad4a2fe5d\") " pod="openshift-logging/collector-qt8j8" Dec 09 15:13:08 crc kubenswrapper[4555]: I1209 15:13:08.747797 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/8dcb0d25-363c-47cb-9173-49dad4a2fe5d-metrics\") pod \"collector-qt8j8\" (UID: \"8dcb0d25-363c-47cb-9173-49dad4a2fe5d\") " pod="openshift-logging/collector-qt8j8" Dec 09 15:13:08 crc kubenswrapper[4555]: I1209 15:13:08.747820 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/8dcb0d25-363c-47cb-9173-49dad4a2fe5d-entrypoint\") pod \"collector-qt8j8\" (UID: \"8dcb0d25-363c-47cb-9173-49dad4a2fe5d\") " pod="openshift-logging/collector-qt8j8" Dec 09 15:13:08 crc kubenswrapper[4555]: 
I1209 15:13:08.748372 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8dcb0d25-363c-47cb-9173-49dad4a2fe5d-trusted-ca\") pod \"collector-qt8j8\" (UID: \"8dcb0d25-363c-47cb-9173-49dad4a2fe5d\") " pod="openshift-logging/collector-qt8j8" Dec 09 15:13:08 crc kubenswrapper[4555]: I1209 15:13:08.749081 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/8dcb0d25-363c-47cb-9173-49dad4a2fe5d-entrypoint\") pod \"collector-qt8j8\" (UID: \"8dcb0d25-363c-47cb-9173-49dad4a2fe5d\") " pod="openshift-logging/collector-qt8j8" Dec 09 15:13:08 crc kubenswrapper[4555]: I1209 15:13:08.749163 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8dcb0d25-363c-47cb-9173-49dad4a2fe5d-config\") pod \"collector-qt8j8\" (UID: \"8dcb0d25-363c-47cb-9173-49dad4a2fe5d\") " pod="openshift-logging/collector-qt8j8" Dec 09 15:13:08 crc kubenswrapper[4555]: I1209 15:13:08.749638 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/8dcb0d25-363c-47cb-9173-49dad4a2fe5d-config-openshift-service-cacrt\") pod \"collector-qt8j8\" (UID: \"8dcb0d25-363c-47cb-9173-49dad4a2fe5d\") " pod="openshift-logging/collector-qt8j8" Dec 09 15:13:08 crc kubenswrapper[4555]: I1209 15:13:08.750991 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8dcb0d25-363c-47cb-9173-49dad4a2fe5d-tmp\") pod \"collector-qt8j8\" (UID: \"8dcb0d25-363c-47cb-9173-49dad4a2fe5d\") " pod="openshift-logging/collector-qt8j8" Dec 09 15:13:08 crc kubenswrapper[4555]: I1209 15:13:08.751242 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/8dcb0d25-363c-47cb-9173-49dad4a2fe5d-collector-syslog-receiver\") pod \"collector-qt8j8\" (UID: \"8dcb0d25-363c-47cb-9173-49dad4a2fe5d\") " pod="openshift-logging/collector-qt8j8" Dec 09 15:13:08 crc kubenswrapper[4555]: I1209 15:13:08.752024 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/8dcb0d25-363c-47cb-9173-49dad4a2fe5d-metrics\") pod \"collector-qt8j8\" (UID: \"8dcb0d25-363c-47cb-9173-49dad4a2fe5d\") " pod="openshift-logging/collector-qt8j8" Dec 09 15:13:08 crc kubenswrapper[4555]: I1209 15:13:08.752633 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/8dcb0d25-363c-47cb-9173-49dad4a2fe5d-collector-token\") pod \"collector-qt8j8\" (UID: \"8dcb0d25-363c-47cb-9173-49dad4a2fe5d\") " pod="openshift-logging/collector-qt8j8" Dec 09 15:13:08 crc kubenswrapper[4555]: I1209 15:13:08.760672 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xr48\" (UniqueName: \"kubernetes.io/projected/8dcb0d25-363c-47cb-9173-49dad4a2fe5d-kube-api-access-8xr48\") pod \"collector-qt8j8\" (UID: \"8dcb0d25-363c-47cb-9173-49dad4a2fe5d\") " pod="openshift-logging/collector-qt8j8" Dec 09 15:13:08 crc kubenswrapper[4555]: I1209 15:13:08.760729 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/8dcb0d25-363c-47cb-9173-49dad4a2fe5d-sa-token\") pod \"collector-qt8j8\" (UID: \"8dcb0d25-363c-47cb-9173-49dad4a2fe5d\") " 
pod="openshift-logging/collector-qt8j8" Dec 09 15:13:08 crc kubenswrapper[4555]: I1209 15:13:08.941279 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-qt8j8" Dec 09 15:13:09 crc kubenswrapper[4555]: I1209 15:13:09.278214 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-qt8j8"] Dec 09 15:13:09 crc kubenswrapper[4555]: I1209 15:13:09.604991 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/collector-qt8j8" event={"ID":"8dcb0d25-363c-47cb-9173-49dad4a2fe5d","Type":"ContainerStarted","Data":"52023b3251db18cb3d9a2ed8b043f2f7cb56e438222a276b315c9043325ff4f7"} Dec 09 15:13:10 crc kubenswrapper[4555]: I1209 15:13:10.166743 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae6b260f-1e75-4470-a796-bc4f0719136a" path="/var/lib/kubelet/pods/ae6b260f-1e75-4470-a796-bc4f0719136a/volumes" Dec 09 15:13:15 crc kubenswrapper[4555]: I1209 15:13:15.648161 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/collector-qt8j8" event={"ID":"8dcb0d25-363c-47cb-9173-49dad4a2fe5d","Type":"ContainerStarted","Data":"7b526b62abc945844056727ebda20a6d9c8f6749ed06149d714df7520040dbe1"} Dec 09 15:13:15 crc kubenswrapper[4555]: I1209 15:13:15.663037 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/collector-qt8j8" podStartSLOduration=2.285452062 podStartE2EDuration="7.663022533s" podCreationTimestamp="2025-12-09 15:13:08 +0000 UTC" firstStartedPulling="2025-12-09 15:13:09.283968326 +0000 UTC m=+899.420681043" lastFinishedPulling="2025-12-09 15:13:14.661538795 +0000 UTC m=+904.798251514" observedRunningTime="2025-12-09 15:13:15.660203247 +0000 UTC m=+905.796915964" watchObservedRunningTime="2025-12-09 15:13:15.663022533 +0000 UTC m=+905.799735252" Dec 09 15:13:25 crc kubenswrapper[4555]: I1209 15:13:25.531430 4555 patch_prober.go:28] interesting pod/machine-config-daemon-x6p6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 15:13:25 crc kubenswrapper[4555]: I1209 15:13:25.531645 4555 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 15:13:39 crc kubenswrapper[4555]: I1209 15:13:39.279932 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fk6tc2"] Dec 09 15:13:39 crc kubenswrapper[4555]: I1209 15:13:39.282342 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fk6tc2" Dec 09 15:13:39 crc kubenswrapper[4555]: I1209 15:13:39.284824 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 09 15:13:39 crc kubenswrapper[4555]: I1209 15:13:39.296566 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fk6tc2"] Dec 09 15:13:39 crc kubenswrapper[4555]: I1209 15:13:39.328017 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7b68f60b-a6f6-4057-b219-fc436349e094-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fk6tc2\" (UID: \"7b68f60b-a6f6-4057-b219-fc436349e094\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fk6tc2" Dec 09 15:13:39 crc kubenswrapper[4555]: I1209 15:13:39.328060 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7b68f60b-a6f6-4057-b219-fc436349e094-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fk6tc2\" (UID: \"7b68f60b-a6f6-4057-b219-fc436349e094\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fk6tc2" Dec 09 15:13:39 crc kubenswrapper[4555]: I1209 15:13:39.328241 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mckqs\" (UniqueName: \"kubernetes.io/projected/7b68f60b-a6f6-4057-b219-fc436349e094-kube-api-access-mckqs\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fk6tc2\" (UID: \"7b68f60b-a6f6-4057-b219-fc436349e094\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fk6tc2" Dec 09 15:13:39 crc kubenswrapper[4555]: I1209 15:13:39.429290 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mckqs\" (UniqueName: \"kubernetes.io/projected/7b68f60b-a6f6-4057-b219-fc436349e094-kube-api-access-mckqs\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fk6tc2\" (UID: \"7b68f60b-a6f6-4057-b219-fc436349e094\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fk6tc2" Dec 09 15:13:39 crc kubenswrapper[4555]: I1209 15:13:39.429416 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7b68f60b-a6f6-4057-b219-fc436349e094-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fk6tc2\" (UID: \"7b68f60b-a6f6-4057-b219-fc436349e094\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fk6tc2" Dec 09 15:13:39 crc kubenswrapper[4555]: I1209 15:13:39.429441 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7b68f60b-a6f6-4057-b219-fc436349e094-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fk6tc2\" (UID: \"7b68f60b-a6f6-4057-b219-fc436349e094\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fk6tc2" Dec 09 15:13:39 crc kubenswrapper[4555]: I1209 15:13:39.429877 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/7b68f60b-a6f6-4057-b219-fc436349e094-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fk6tc2\" (UID: \"7b68f60b-a6f6-4057-b219-fc436349e094\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fk6tc2" Dec 09 15:13:39 crc kubenswrapper[4555]: I1209 15:13:39.430101 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7b68f60b-a6f6-4057-b219-fc436349e094-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fk6tc2\" (UID: \"7b68f60b-a6f6-4057-b219-fc436349e094\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fk6tc2" Dec 09 15:13:39 crc kubenswrapper[4555]: I1209 15:13:39.444913 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mckqs\" (UniqueName: \"kubernetes.io/projected/7b68f60b-a6f6-4057-b219-fc436349e094-kube-api-access-mckqs\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fk6tc2\" (UID: \"7b68f60b-a6f6-4057-b219-fc436349e094\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fk6tc2" Dec 09 15:13:39 crc kubenswrapper[4555]: I1209 15:13:39.595082 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fk6tc2" Dec 09 15:13:39 crc kubenswrapper[4555]: I1209 15:13:39.956325 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fk6tc2"] Dec 09 15:13:40 crc kubenswrapper[4555]: I1209 15:13:40.771258 4555 generic.go:334] "Generic (PLEG): container finished" podID="7b68f60b-a6f6-4057-b219-fc436349e094" containerID="0ddc3baf002345c7bf426392f543900a52d0ce009d042d8f0d4242bc31efa30a" exitCode=0 Dec 09 15:13:40 crc kubenswrapper[4555]: I1209 15:13:40.771296 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fk6tc2" event={"ID":"7b68f60b-a6f6-4057-b219-fc436349e094","Type":"ContainerDied","Data":"0ddc3baf002345c7bf426392f543900a52d0ce009d042d8f0d4242bc31efa30a"} Dec 09 15:13:40 crc kubenswrapper[4555]: I1209 15:13:40.771345 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fk6tc2" event={"ID":"7b68f60b-a6f6-4057-b219-fc436349e094","Type":"ContainerStarted","Data":"2073e2191e3a5e05cdf2e614527a40d309cb17f11cd60a53522887d5f302e0db"} Dec 09 15:13:42 crc kubenswrapper[4555]: I1209 15:13:42.782702 4555 generic.go:334] "Generic (PLEG): container finished" podID="7b68f60b-a6f6-4057-b219-fc436349e094" containerID="e3a9c4f6670499c2bb86254e5eff02983b07610ab12ee1ae0ec580ed2c38b036" exitCode=0 Dec 09 15:13:42 crc kubenswrapper[4555]: I1209 15:13:42.782747 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fk6tc2" event={"ID":"7b68f60b-a6f6-4057-b219-fc436349e094","Type":"ContainerDied","Data":"e3a9c4f6670499c2bb86254e5eff02983b07610ab12ee1ae0ec580ed2c38b036"} Dec 09 15:13:43 crc kubenswrapper[4555]: I1209 15:13:43.790052 4555 generic.go:334] "Generic (PLEG): container finished" podID="7b68f60b-a6f6-4057-b219-fc436349e094" containerID="db601b2e59e7854eafa78d4283854c937d474685061e0a6a6eed9743d493641b" exitCode=0 Dec 09 15:13:43 crc kubenswrapper[4555]: I1209 
15:13:43.790146 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fk6tc2" event={"ID":"7b68f60b-a6f6-4057-b219-fc436349e094","Type":"ContainerDied","Data":"db601b2e59e7854eafa78d4283854c937d474685061e0a6a6eed9743d493641b"} Dec 09 15:13:45 crc kubenswrapper[4555]: I1209 15:13:45.007176 4555 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fk6tc2" Dec 09 15:13:45 crc kubenswrapper[4555]: I1209 15:13:45.102190 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7b68f60b-a6f6-4057-b219-fc436349e094-bundle\") pod \"7b68f60b-a6f6-4057-b219-fc436349e094\" (UID: \"7b68f60b-a6f6-4057-b219-fc436349e094\") " Dec 09 15:13:45 crc kubenswrapper[4555]: I1209 15:13:45.102285 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mckqs\" (UniqueName: \"kubernetes.io/projected/7b68f60b-a6f6-4057-b219-fc436349e094-kube-api-access-mckqs\") pod \"7b68f60b-a6f6-4057-b219-fc436349e094\" (UID: \"7b68f60b-a6f6-4057-b219-fc436349e094\") " Dec 09 15:13:45 crc kubenswrapper[4555]: I1209 15:13:45.102438 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7b68f60b-a6f6-4057-b219-fc436349e094-util\") pod \"7b68f60b-a6f6-4057-b219-fc436349e094\" (UID: \"7b68f60b-a6f6-4057-b219-fc436349e094\") " Dec 09 15:13:45 crc kubenswrapper[4555]: I1209 15:13:45.103113 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b68f60b-a6f6-4057-b219-fc436349e094-bundle" (OuterVolumeSpecName: "bundle") pod "7b68f60b-a6f6-4057-b219-fc436349e094" (UID: "7b68f60b-a6f6-4057-b219-fc436349e094"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:13:45 crc kubenswrapper[4555]: I1209 15:13:45.106641 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b68f60b-a6f6-4057-b219-fc436349e094-kube-api-access-mckqs" (OuterVolumeSpecName: "kube-api-access-mckqs") pod "7b68f60b-a6f6-4057-b219-fc436349e094" (UID: "7b68f60b-a6f6-4057-b219-fc436349e094"). InnerVolumeSpecName "kube-api-access-mckqs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:13:45 crc kubenswrapper[4555]: I1209 15:13:45.112224 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b68f60b-a6f6-4057-b219-fc436349e094-util" (OuterVolumeSpecName: "util") pod "7b68f60b-a6f6-4057-b219-fc436349e094" (UID: "7b68f60b-a6f6-4057-b219-fc436349e094"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:13:45 crc kubenswrapper[4555]: I1209 15:13:45.204582 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mckqs\" (UniqueName: \"kubernetes.io/projected/7b68f60b-a6f6-4057-b219-fc436349e094-kube-api-access-mckqs\") on node \"crc\" DevicePath \"\"" Dec 09 15:13:45 crc kubenswrapper[4555]: I1209 15:13:45.204752 4555 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7b68f60b-a6f6-4057-b219-fc436349e094-util\") on node \"crc\" DevicePath \"\"" Dec 09 15:13:45 crc kubenswrapper[4555]: I1209 15:13:45.204819 4555 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7b68f60b-a6f6-4057-b219-fc436349e094-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 15:13:45 crc kubenswrapper[4555]: I1209 15:13:45.801866 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fk6tc2" event={"ID":"7b68f60b-a6f6-4057-b219-fc436349e094","Type":"ContainerDied","Data":"2073e2191e3a5e05cdf2e614527a40d309cb17f11cd60a53522887d5f302e0db"} Dec 09 15:13:45 crc kubenswrapper[4555]: I1209 15:13:45.802095 4555 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2073e2191e3a5e05cdf2e614527a40d309cb17f11cd60a53522887d5f302e0db" Dec 09 15:13:45 crc kubenswrapper[4555]: I1209 15:13:45.801968 4555 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fk6tc2" Dec 09 15:13:48 crc kubenswrapper[4555]: I1209 15:13:48.834368 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-q5d4p"] Dec 09 15:13:48 crc kubenswrapper[4555]: E1209 15:13:48.834753 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b68f60b-a6f6-4057-b219-fc436349e094" containerName="extract" Dec 09 15:13:48 crc kubenswrapper[4555]: I1209 15:13:48.834766 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b68f60b-a6f6-4057-b219-fc436349e094" containerName="extract" Dec 09 15:13:48 crc kubenswrapper[4555]: E1209 15:13:48.834797 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b68f60b-a6f6-4057-b219-fc436349e094" containerName="util" Dec 09 15:13:48 crc kubenswrapper[4555]: I1209 15:13:48.834802 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b68f60b-a6f6-4057-b219-fc436349e094" containerName="util" Dec 09 15:13:48 crc kubenswrapper[4555]: E1209 15:13:48.834807 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b68f60b-a6f6-4057-b219-fc436349e094" containerName="pull" Dec 09 15:13:48 crc kubenswrapper[4555]: I1209 15:13:48.834813 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b68f60b-a6f6-4057-b219-fc436349e094" containerName="pull" Dec 09 15:13:48 crc kubenswrapper[4555]: I1209 15:13:48.834932 4555 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b68f60b-a6f6-4057-b219-fc436349e094" containerName="extract" Dec 09 15:13:48 crc kubenswrapper[4555]: I1209 15:13:48.835351 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-q5d4p" Dec 09 15:13:48 crc kubenswrapper[4555]: I1209 15:13:48.837673 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Dec 09 15:13:48 crc kubenswrapper[4555]: I1209 15:13:48.837886 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-qgqrk" Dec 09 15:13:48 crc kubenswrapper[4555]: I1209 15:13:48.838027 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Dec 09 15:13:48 crc kubenswrapper[4555]: I1209 15:13:48.842244 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-q5d4p"] Dec 09 15:13:48 crc kubenswrapper[4555]: I1209 15:13:48.954047 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8bgz\" (UniqueName: \"kubernetes.io/projected/759b2d4d-b483-4fcf-8f2a-f5893d685d51-kube-api-access-l8bgz\") pod \"nmstate-operator-5b5b58f5c8-q5d4p\" (UID: \"759b2d4d-b483-4fcf-8f2a-f5893d685d51\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-q5d4p" Dec 09 15:13:49 crc kubenswrapper[4555]: I1209 15:13:49.056142 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8bgz\" (UniqueName: \"kubernetes.io/projected/759b2d4d-b483-4fcf-8f2a-f5893d685d51-kube-api-access-l8bgz\") pod \"nmstate-operator-5b5b58f5c8-q5d4p\" (UID: \"759b2d4d-b483-4fcf-8f2a-f5893d685d51\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-q5d4p" Dec 09 15:13:49 crc kubenswrapper[4555]: I1209 15:13:49.071851 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8bgz\" (UniqueName: \"kubernetes.io/projected/759b2d4d-b483-4fcf-8f2a-f5893d685d51-kube-api-access-l8bgz\") pod \"nmstate-operator-5b5b58f5c8-q5d4p\" (UID: \"759b2d4d-b483-4fcf-8f2a-f5893d685d51\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-q5d4p" Dec 09 15:13:49 crc kubenswrapper[4555]: I1209 15:13:49.156367 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-q5d4p" Dec 09 15:13:49 crc kubenswrapper[4555]: I1209 15:13:49.546944 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-q5d4p"] Dec 09 15:13:49 crc kubenswrapper[4555]: I1209 15:13:49.822867 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-q5d4p" event={"ID":"759b2d4d-b483-4fcf-8f2a-f5893d685d51","Type":"ContainerStarted","Data":"78ecc3c8cb8015c09227d4f554cdeeb4028e819c5408fb6e8772ca2e3e3b28fd"} Dec 09 15:13:51 crc kubenswrapper[4555]: I1209 15:13:51.834489 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-q5d4p" event={"ID":"759b2d4d-b483-4fcf-8f2a-f5893d685d51","Type":"ContainerStarted","Data":"aedb267c243a082159735d494cd252351eb492f5100c5a91d05284e7810dac99"} Dec 09 15:13:51 crc kubenswrapper[4555]: I1209 15:13:51.846703 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-q5d4p" podStartSLOduration=1.838345322 podStartE2EDuration="3.846689389s" podCreationTimestamp="2025-12-09 15:13:48 +0000 UTC" firstStartedPulling="2025-12-09 15:13:49.553146211 +0000 UTC m=+939.689858929" lastFinishedPulling="2025-12-09 15:13:51.561490278 +0000 UTC m=+941.698202996" observedRunningTime="2025-12-09 15:13:51.844952845 +0000 UTC m=+941.981665562" watchObservedRunningTime="2025-12-09 15:13:51.846689389 +0000 UTC m=+941.983402108" Dec 09 15:13:52 crc kubenswrapper[4555]: I1209 15:13:52.620589 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-phn8q"] Dec 09 15:13:52 crc kubenswrapper[4555]: I1209 15:13:52.637820 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-phn8q"] Dec 09 15:13:52 crc kubenswrapper[4555]: I1209 15:13:52.637926 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-phn8q" Dec 09 15:13:52 crc kubenswrapper[4555]: I1209 15:13:52.643340 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-t6lcd"] Dec 09 15:13:52 crc kubenswrapper[4555]: I1209 15:13:52.643500 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-vtf4l" Dec 09 15:13:52 crc kubenswrapper[4555]: I1209 15:13:52.644502 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-t6lcd" Dec 09 15:13:52 crc kubenswrapper[4555]: I1209 15:13:52.649163 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-pdmhs"] Dec 09 15:13:52 crc kubenswrapper[4555]: I1209 15:13:52.657988 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-pdmhs"] Dec 09 15:13:52 crc kubenswrapper[4555]: I1209 15:13:52.658064 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-pdmhs" Dec 09 15:13:52 crc kubenswrapper[4555]: I1209 15:13:52.663240 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Dec 09 15:13:52 crc kubenswrapper[4555]: I1209 15:13:52.712664 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7r4dq\" (UniqueName: \"kubernetes.io/projected/06ff1c84-8959-4b54-9fea-12a6d8ed97d5-kube-api-access-7r4dq\") pod \"nmstate-webhook-5f6d4c5ccb-pdmhs\" (UID: \"06ff1c84-8959-4b54-9fea-12a6d8ed97d5\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-pdmhs" Dec 09 15:13:52 crc kubenswrapper[4555]: I1209 15:13:52.712723 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nf9bl\" (UniqueName: \"kubernetes.io/projected/41e8b8f7-9b28-46f9-b474-9d519d0e7875-kube-api-access-nf9bl\") pod \"nmstate-metrics-7f946cbc9-phn8q\" (UID: \"41e8b8f7-9b28-46f9-b474-9d519d0e7875\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-phn8q" Dec 09 15:13:52 crc kubenswrapper[4555]: I1209 15:13:52.712758 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/96ba4ce4-cba2-47a3-b353-e8f3daedfa5a-nmstate-lock\") pod \"nmstate-handler-t6lcd\" (UID: \"96ba4ce4-cba2-47a3-b353-e8f3daedfa5a\") " pod="openshift-nmstate/nmstate-handler-t6lcd" Dec 09 15:13:52 crc kubenswrapper[4555]: I1209 15:13:52.712773 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfnv8\" (UniqueName: \"kubernetes.io/projected/96ba4ce4-cba2-47a3-b353-e8f3daedfa5a-kube-api-access-pfnv8\") pod \"nmstate-handler-t6lcd\" (UID: \"96ba4ce4-cba2-47a3-b353-e8f3daedfa5a\") " pod="openshift-nmstate/nmstate-handler-t6lcd" Dec 09 15:13:52 crc kubenswrapper[4555]: I1209 15:13:52.712838 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/06ff1c84-8959-4b54-9fea-12a6d8ed97d5-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-pdmhs\" (UID: \"06ff1c84-8959-4b54-9fea-12a6d8ed97d5\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-pdmhs" Dec 09 15:13:52 crc kubenswrapper[4555]: I1209 15:13:52.712868 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/96ba4ce4-cba2-47a3-b353-e8f3daedfa5a-dbus-socket\") pod \"nmstate-handler-t6lcd\" (UID: \"96ba4ce4-cba2-47a3-b353-e8f3daedfa5a\") " pod="openshift-nmstate/nmstate-handler-t6lcd" Dec 09 15:13:52 crc kubenswrapper[4555]: I1209 15:13:52.712945 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/96ba4ce4-cba2-47a3-b353-e8f3daedfa5a-ovs-socket\") pod \"nmstate-handler-t6lcd\" (UID: \"96ba4ce4-cba2-47a3-b353-e8f3daedfa5a\") " pod="openshift-nmstate/nmstate-handler-t6lcd" Dec 09 15:13:52 crc kubenswrapper[4555]: I1209 15:13:52.742150 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-kcpqb"] Dec 09 15:13:52 crc kubenswrapper[4555]: I1209 15:13:52.742945 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-kcpqb" Dec 09 15:13:52 crc kubenswrapper[4555]: I1209 15:13:52.745045 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-mbnkv" Dec 09 15:13:52 crc kubenswrapper[4555]: I1209 15:13:52.745220 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Dec 09 15:13:52 crc kubenswrapper[4555]: I1209 15:13:52.745359 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Dec 09 15:13:52 crc kubenswrapper[4555]: I1209 15:13:52.762208 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-kcpqb"] Dec 09 15:13:52 crc kubenswrapper[4555]: I1209 15:13:52.813649 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/96ba4ce4-cba2-47a3-b353-e8f3daedfa5a-ovs-socket\") pod \"nmstate-handler-t6lcd\" (UID: \"96ba4ce4-cba2-47a3-b353-e8f3daedfa5a\") " pod="openshift-nmstate/nmstate-handler-t6lcd" Dec 09 15:13:52 crc kubenswrapper[4555]: I1209 15:13:52.813711 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7r4dq\" (UniqueName: \"kubernetes.io/projected/06ff1c84-8959-4b54-9fea-12a6d8ed97d5-kube-api-access-7r4dq\") pod \"nmstate-webhook-5f6d4c5ccb-pdmhs\" (UID: \"06ff1c84-8959-4b54-9fea-12a6d8ed97d5\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-pdmhs" Dec 09 15:13:52 crc kubenswrapper[4555]: I1209 15:13:52.813742 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nf9bl\" (UniqueName: \"kubernetes.io/projected/41e8b8f7-9b28-46f9-b474-9d519d0e7875-kube-api-access-nf9bl\") pod \"nmstate-metrics-7f946cbc9-phn8q\" (UID: \"41e8b8f7-9b28-46f9-b474-9d519d0e7875\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-phn8q" Dec 09 15:13:52 crc kubenswrapper[4555]: I1209 15:13:52.813744 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/96ba4ce4-cba2-47a3-b353-e8f3daedfa5a-ovs-socket\") pod \"nmstate-handler-t6lcd\" (UID: \"96ba4ce4-cba2-47a3-b353-e8f3daedfa5a\") " pod="openshift-nmstate/nmstate-handler-t6lcd" Dec 09 15:13:52 crc kubenswrapper[4555]: I1209 15:13:52.813766 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/96ba4ce4-cba2-47a3-b353-e8f3daedfa5a-nmstate-lock\") pod \"nmstate-handler-t6lcd\" (UID: \"96ba4ce4-cba2-47a3-b353-e8f3daedfa5a\") " pod="openshift-nmstate/nmstate-handler-t6lcd" Dec 09 15:13:52 crc kubenswrapper[4555]: I1209 15:13:52.813782 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfnv8\" (UniqueName: \"kubernetes.io/projected/96ba4ce4-cba2-47a3-b353-e8f3daedfa5a-kube-api-access-pfnv8\") pod \"nmstate-handler-t6lcd\" (UID: \"96ba4ce4-cba2-47a3-b353-e8f3daedfa5a\") " pod="openshift-nmstate/nmstate-handler-t6lcd" Dec 09 15:13:52 crc kubenswrapper[4555]: I1209 15:13:52.813833 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcn4z\" (UniqueName: \"kubernetes.io/projected/41bb2d5e-b40e-4088-9ffa-ac0234d7050c-kube-api-access-fcn4z\") pod \"nmstate-console-plugin-7fbb5f6569-kcpqb\" (UID: \"41bb2d5e-b40e-4088-9ffa-ac0234d7050c\") " 
pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-kcpqb" Dec 09 15:13:52 crc kubenswrapper[4555]: I1209 15:13:52.813851 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/06ff1c84-8959-4b54-9fea-12a6d8ed97d5-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-pdmhs\" (UID: \"06ff1c84-8959-4b54-9fea-12a6d8ed97d5\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-pdmhs" Dec 09 15:13:52 crc kubenswrapper[4555]: I1209 15:13:52.813875 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/96ba4ce4-cba2-47a3-b353-e8f3daedfa5a-dbus-socket\") pod \"nmstate-handler-t6lcd\" (UID: \"96ba4ce4-cba2-47a3-b353-e8f3daedfa5a\") " pod="openshift-nmstate/nmstate-handler-t6lcd" Dec 09 15:13:52 crc kubenswrapper[4555]: I1209 15:13:52.813892 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/41bb2d5e-b40e-4088-9ffa-ac0234d7050c-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-kcpqb\" (UID: \"41bb2d5e-b40e-4088-9ffa-ac0234d7050c\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-kcpqb" Dec 09 15:13:52 crc kubenswrapper[4555]: I1209 15:13:52.813920 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/41bb2d5e-b40e-4088-9ffa-ac0234d7050c-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-kcpqb\" (UID: \"41bb2d5e-b40e-4088-9ffa-ac0234d7050c\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-kcpqb" Dec 09 15:13:52 crc kubenswrapper[4555]: I1209 15:13:52.814009 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/96ba4ce4-cba2-47a3-b353-e8f3daedfa5a-nmstate-lock\") pod \"nmstate-handler-t6lcd\" (UID: \"96ba4ce4-cba2-47a3-b353-e8f3daedfa5a\") " pod="openshift-nmstate/nmstate-handler-t6lcd" Dec 09 15:13:52 crc kubenswrapper[4555]: I1209 15:13:52.814371 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/96ba4ce4-cba2-47a3-b353-e8f3daedfa5a-dbus-socket\") pod \"nmstate-handler-t6lcd\" (UID: \"96ba4ce4-cba2-47a3-b353-e8f3daedfa5a\") " pod="openshift-nmstate/nmstate-handler-t6lcd" Dec 09 15:13:52 crc kubenswrapper[4555]: I1209 15:13:52.819110 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/06ff1c84-8959-4b54-9fea-12a6d8ed97d5-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-pdmhs\" (UID: \"06ff1c84-8959-4b54-9fea-12a6d8ed97d5\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-pdmhs" Dec 09 15:13:52 crc kubenswrapper[4555]: I1209 15:13:52.828866 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfnv8\" (UniqueName: \"kubernetes.io/projected/96ba4ce4-cba2-47a3-b353-e8f3daedfa5a-kube-api-access-pfnv8\") pod \"nmstate-handler-t6lcd\" (UID: \"96ba4ce4-cba2-47a3-b353-e8f3daedfa5a\") " pod="openshift-nmstate/nmstate-handler-t6lcd" Dec 09 15:13:52 crc kubenswrapper[4555]: I1209 15:13:52.834679 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nf9bl\" (UniqueName: \"kubernetes.io/projected/41e8b8f7-9b28-46f9-b474-9d519d0e7875-kube-api-access-nf9bl\") pod \"nmstate-metrics-7f946cbc9-phn8q\" (UID: 
\"41e8b8f7-9b28-46f9-b474-9d519d0e7875\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-phn8q" Dec 09 15:13:52 crc kubenswrapper[4555]: I1209 15:13:52.839684 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7r4dq\" (UniqueName: \"kubernetes.io/projected/06ff1c84-8959-4b54-9fea-12a6d8ed97d5-kube-api-access-7r4dq\") pod \"nmstate-webhook-5f6d4c5ccb-pdmhs\" (UID: \"06ff1c84-8959-4b54-9fea-12a6d8ed97d5\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-pdmhs" Dec 09 15:13:52 crc kubenswrapper[4555]: I1209 15:13:52.900715 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-568d4776df-zlwzf"] Dec 09 15:13:52 crc kubenswrapper[4555]: I1209 15:13:52.901489 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-568d4776df-zlwzf" Dec 09 15:13:52 crc kubenswrapper[4555]: I1209 15:13:52.912691 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-568d4776df-zlwzf"] Dec 09 15:13:52 crc kubenswrapper[4555]: I1209 15:13:52.914689 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcn4z\" (UniqueName: \"kubernetes.io/projected/41bb2d5e-b40e-4088-9ffa-ac0234d7050c-kube-api-access-fcn4z\") pod \"nmstate-console-plugin-7fbb5f6569-kcpqb\" (UID: \"41bb2d5e-b40e-4088-9ffa-ac0234d7050c\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-kcpqb" Dec 09 15:13:52 crc kubenswrapper[4555]: I1209 15:13:52.914745 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/41bb2d5e-b40e-4088-9ffa-ac0234d7050c-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-kcpqb\" (UID: \"41bb2d5e-b40e-4088-9ffa-ac0234d7050c\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-kcpqb" Dec 09 15:13:52 crc kubenswrapper[4555]: I1209 15:13:52.914770 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/41bb2d5e-b40e-4088-9ffa-ac0234d7050c-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-kcpqb\" (UID: \"41bb2d5e-b40e-4088-9ffa-ac0234d7050c\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-kcpqb" Dec 09 15:13:52 crc kubenswrapper[4555]: I1209 15:13:52.915683 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/41bb2d5e-b40e-4088-9ffa-ac0234d7050c-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-kcpqb\" (UID: \"41bb2d5e-b40e-4088-9ffa-ac0234d7050c\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-kcpqb" Dec 09 15:13:52 crc kubenswrapper[4555]: I1209 15:13:52.930929 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/41bb2d5e-b40e-4088-9ffa-ac0234d7050c-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-kcpqb\" (UID: \"41bb2d5e-b40e-4088-9ffa-ac0234d7050c\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-kcpqb" Dec 09 15:13:52 crc kubenswrapper[4555]: I1209 15:13:52.938051 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcn4z\" (UniqueName: \"kubernetes.io/projected/41bb2d5e-b40e-4088-9ffa-ac0234d7050c-kube-api-access-fcn4z\") pod \"nmstate-console-plugin-7fbb5f6569-kcpqb\" (UID: \"41bb2d5e-b40e-4088-9ffa-ac0234d7050c\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-kcpqb" Dec 09 
15:13:52 crc kubenswrapper[4555]: I1209 15:13:52.957450 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-phn8q" Dec 09 15:13:52 crc kubenswrapper[4555]: I1209 15:13:52.978497 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-t6lcd" Dec 09 15:13:52 crc kubenswrapper[4555]: I1209 15:13:52.983587 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-pdmhs" Dec 09 15:13:53 crc kubenswrapper[4555]: I1209 15:13:53.016179 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/135026f5-8973-42fc-9597-b0193a479da5-console-oauth-config\") pod \"console-568d4776df-zlwzf\" (UID: \"135026f5-8973-42fc-9597-b0193a479da5\") " pod="openshift-console/console-568d4776df-zlwzf" Dec 09 15:13:53 crc kubenswrapper[4555]: I1209 15:13:53.016224 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lt5pz\" (UniqueName: \"kubernetes.io/projected/135026f5-8973-42fc-9597-b0193a479da5-kube-api-access-lt5pz\") pod \"console-568d4776df-zlwzf\" (UID: \"135026f5-8973-42fc-9597-b0193a479da5\") " pod="openshift-console/console-568d4776df-zlwzf" Dec 09 15:13:53 crc kubenswrapper[4555]: I1209 15:13:53.016261 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/135026f5-8973-42fc-9597-b0193a479da5-service-ca\") pod \"console-568d4776df-zlwzf\" (UID: \"135026f5-8973-42fc-9597-b0193a479da5\") " pod="openshift-console/console-568d4776df-zlwzf" Dec 09 15:13:53 crc kubenswrapper[4555]: I1209 15:13:53.016285 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/135026f5-8973-42fc-9597-b0193a479da5-trusted-ca-bundle\") pod \"console-568d4776df-zlwzf\" (UID: \"135026f5-8973-42fc-9597-b0193a479da5\") " pod="openshift-console/console-568d4776df-zlwzf" Dec 09 15:13:53 crc kubenswrapper[4555]: I1209 15:13:53.016336 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/135026f5-8973-42fc-9597-b0193a479da5-oauth-serving-cert\") pod \"console-568d4776df-zlwzf\" (UID: \"135026f5-8973-42fc-9597-b0193a479da5\") " pod="openshift-console/console-568d4776df-zlwzf" Dec 09 15:13:53 crc kubenswrapper[4555]: I1209 15:13:53.016358 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/135026f5-8973-42fc-9597-b0193a479da5-console-serving-cert\") pod \"console-568d4776df-zlwzf\" (UID: \"135026f5-8973-42fc-9597-b0193a479da5\") " pod="openshift-console/console-568d4776df-zlwzf" Dec 09 15:13:53 crc kubenswrapper[4555]: I1209 15:13:53.016376 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/135026f5-8973-42fc-9597-b0193a479da5-console-config\") pod \"console-568d4776df-zlwzf\" (UID: \"135026f5-8973-42fc-9597-b0193a479da5\") " pod="openshift-console/console-568d4776df-zlwzf" Dec 09 15:13:53 crc kubenswrapper[4555]: I1209 15:13:53.057603 4555 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-kcpqb" Dec 09 15:13:53 crc kubenswrapper[4555]: I1209 15:13:53.117555 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/135026f5-8973-42fc-9597-b0193a479da5-console-oauth-config\") pod \"console-568d4776df-zlwzf\" (UID: \"135026f5-8973-42fc-9597-b0193a479da5\") " pod="openshift-console/console-568d4776df-zlwzf" Dec 09 15:13:53 crc kubenswrapper[4555]: I1209 15:13:53.117601 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lt5pz\" (UniqueName: \"kubernetes.io/projected/135026f5-8973-42fc-9597-b0193a479da5-kube-api-access-lt5pz\") pod \"console-568d4776df-zlwzf\" (UID: \"135026f5-8973-42fc-9597-b0193a479da5\") " pod="openshift-console/console-568d4776df-zlwzf" Dec 09 15:13:53 crc kubenswrapper[4555]: I1209 15:13:53.117689 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/135026f5-8973-42fc-9597-b0193a479da5-service-ca\") pod \"console-568d4776df-zlwzf\" (UID: \"135026f5-8973-42fc-9597-b0193a479da5\") " pod="openshift-console/console-568d4776df-zlwzf" Dec 09 15:13:53 crc kubenswrapper[4555]: I1209 15:13:53.117995 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/135026f5-8973-42fc-9597-b0193a479da5-trusted-ca-bundle\") pod \"console-568d4776df-zlwzf\" (UID: \"135026f5-8973-42fc-9597-b0193a479da5\") " pod="openshift-console/console-568d4776df-zlwzf" Dec 09 15:13:53 crc kubenswrapper[4555]: I1209 15:13:53.118502 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/135026f5-8973-42fc-9597-b0193a479da5-service-ca\") pod \"console-568d4776df-zlwzf\" (UID: \"135026f5-8973-42fc-9597-b0193a479da5\") " pod="openshift-console/console-568d4776df-zlwzf" Dec 09 15:13:53 crc kubenswrapper[4555]: I1209 15:13:53.118772 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/135026f5-8973-42fc-9597-b0193a479da5-trusted-ca-bundle\") pod \"console-568d4776df-zlwzf\" (UID: \"135026f5-8973-42fc-9597-b0193a479da5\") " pod="openshift-console/console-568d4776df-zlwzf" Dec 09 15:13:53 crc kubenswrapper[4555]: I1209 15:13:53.118873 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/135026f5-8973-42fc-9597-b0193a479da5-oauth-serving-cert\") pod \"console-568d4776df-zlwzf\" (UID: \"135026f5-8973-42fc-9597-b0193a479da5\") " pod="openshift-console/console-568d4776df-zlwzf" Dec 09 15:13:53 crc kubenswrapper[4555]: I1209 15:13:53.119403 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/135026f5-8973-42fc-9597-b0193a479da5-oauth-serving-cert\") pod \"console-568d4776df-zlwzf\" (UID: \"135026f5-8973-42fc-9597-b0193a479da5\") " pod="openshift-console/console-568d4776df-zlwzf" Dec 09 15:13:53 crc kubenswrapper[4555]: I1209 15:13:53.119438 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/135026f5-8973-42fc-9597-b0193a479da5-console-serving-cert\") pod \"console-568d4776df-zlwzf\" (UID: 
\"135026f5-8973-42fc-9597-b0193a479da5\") " pod="openshift-console/console-568d4776df-zlwzf" Dec 09 15:13:53 crc kubenswrapper[4555]: I1209 15:13:53.119464 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/135026f5-8973-42fc-9597-b0193a479da5-console-config\") pod \"console-568d4776df-zlwzf\" (UID: \"135026f5-8973-42fc-9597-b0193a479da5\") " pod="openshift-console/console-568d4776df-zlwzf" Dec 09 15:13:53 crc kubenswrapper[4555]: I1209 15:13:53.120296 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/135026f5-8973-42fc-9597-b0193a479da5-console-config\") pod \"console-568d4776df-zlwzf\" (UID: \"135026f5-8973-42fc-9597-b0193a479da5\") " pod="openshift-console/console-568d4776df-zlwzf" Dec 09 15:13:53 crc kubenswrapper[4555]: I1209 15:13:53.122282 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/135026f5-8973-42fc-9597-b0193a479da5-console-oauth-config\") pod \"console-568d4776df-zlwzf\" (UID: \"135026f5-8973-42fc-9597-b0193a479da5\") " pod="openshift-console/console-568d4776df-zlwzf" Dec 09 15:13:53 crc kubenswrapper[4555]: I1209 15:13:53.122412 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/135026f5-8973-42fc-9597-b0193a479da5-console-serving-cert\") pod \"console-568d4776df-zlwzf\" (UID: \"135026f5-8973-42fc-9597-b0193a479da5\") " pod="openshift-console/console-568d4776df-zlwzf" Dec 09 15:13:53 crc kubenswrapper[4555]: I1209 15:13:53.149828 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lt5pz\" (UniqueName: \"kubernetes.io/projected/135026f5-8973-42fc-9597-b0193a479da5-kube-api-access-lt5pz\") pod \"console-568d4776df-zlwzf\" (UID: \"135026f5-8973-42fc-9597-b0193a479da5\") " pod="openshift-console/console-568d4776df-zlwzf" Dec 09 15:13:53 crc kubenswrapper[4555]: I1209 15:13:53.217073 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-568d4776df-zlwzf" Dec 09 15:13:53 crc kubenswrapper[4555]: I1209 15:13:53.446734 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-phn8q"] Dec 09 15:13:53 crc kubenswrapper[4555]: I1209 15:13:53.450840 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-pdmhs"] Dec 09 15:13:53 crc kubenswrapper[4555]: W1209 15:13:53.452217 4555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41e8b8f7_9b28_46f9_b474_9d519d0e7875.slice/crio-ef56fc08a3eeee791a4e36f513bf3482c4275c7b06222a81e6cac36fe74a0f6b WatchSource:0}: Error finding container ef56fc08a3eeee791a4e36f513bf3482c4275c7b06222a81e6cac36fe74a0f6b: Status 404 returned error can't find the container with id ef56fc08a3eeee791a4e36f513bf3482c4275c7b06222a81e6cac36fe74a0f6b Dec 09 15:13:53 crc kubenswrapper[4555]: W1209 15:13:53.452446 4555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06ff1c84_8959_4b54_9fea_12a6d8ed97d5.slice/crio-475c082f1c1dda4bab0343a7ca5c1303d0a00d0d8247cf07a86cc4d38d07b43a WatchSource:0}: Error finding container 475c082f1c1dda4bab0343a7ca5c1303d0a00d0d8247cf07a86cc4d38d07b43a: Status 404 returned error can't find the container with id 475c082f1c1dda4bab0343a7ca5c1303d0a00d0d8247cf07a86cc4d38d07b43a Dec 09 15:13:53 crc kubenswrapper[4555]: I1209 15:13:53.559186 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-kcpqb"] Dec 09 15:13:53 crc kubenswrapper[4555]: W1209 15:13:53.560608 4555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41bb2d5e_b40e_4088_9ffa_ac0234d7050c.slice/crio-bcd90ca843b4d1f1b8fee2bc4048a2fdd0f19f8698d0a8fddb9b32ca5333753b WatchSource:0}: Error finding container bcd90ca843b4d1f1b8fee2bc4048a2fdd0f19f8698d0a8fddb9b32ca5333753b: Status 404 returned error can't find the container with id bcd90ca843b4d1f1b8fee2bc4048a2fdd0f19f8698d0a8fddb9b32ca5333753b Dec 09 15:13:53 crc kubenswrapper[4555]: I1209 15:13:53.606521 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-568d4776df-zlwzf"] Dec 09 15:13:53 crc kubenswrapper[4555]: W1209 15:13:53.607509 4555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod135026f5_8973_42fc_9597_b0193a479da5.slice/crio-dfd688fe8158aca46b3bc4bc480f0b38a9bbfec5f714dcdf9f88d03fedd31bbc WatchSource:0}: Error finding container dfd688fe8158aca46b3bc4bc480f0b38a9bbfec5f714dcdf9f88d03fedd31bbc: Status 404 returned error can't find the container with id dfd688fe8158aca46b3bc4bc480f0b38a9bbfec5f714dcdf9f88d03fedd31bbc Dec 09 15:13:53 crc kubenswrapper[4555]: I1209 15:13:53.846330 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-568d4776df-zlwzf" event={"ID":"135026f5-8973-42fc-9597-b0193a479da5","Type":"ContainerStarted","Data":"16cb98ab8746cd63e3f8a5de9b4c956fce553c4707a23d685d28f0b70c286965"} Dec 09 15:13:53 crc kubenswrapper[4555]: I1209 15:13:53.846670 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-568d4776df-zlwzf" 
event={"ID":"135026f5-8973-42fc-9597-b0193a479da5","Type":"ContainerStarted","Data":"dfd688fe8158aca46b3bc4bc480f0b38a9bbfec5f714dcdf9f88d03fedd31bbc"} Dec 09 15:13:53 crc kubenswrapper[4555]: I1209 15:13:53.848508 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-t6lcd" event={"ID":"96ba4ce4-cba2-47a3-b353-e8f3daedfa5a","Type":"ContainerStarted","Data":"9ffa9710b754edf6096daa48963d8d134e84fe65f457bda2d2ac7f1d2ec59619"} Dec 09 15:13:53 crc kubenswrapper[4555]: I1209 15:13:53.849983 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-pdmhs" event={"ID":"06ff1c84-8959-4b54-9fea-12a6d8ed97d5","Type":"ContainerStarted","Data":"475c082f1c1dda4bab0343a7ca5c1303d0a00d0d8247cf07a86cc4d38d07b43a"} Dec 09 15:13:53 crc kubenswrapper[4555]: I1209 15:13:53.851687 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-kcpqb" event={"ID":"41bb2d5e-b40e-4088-9ffa-ac0234d7050c","Type":"ContainerStarted","Data":"bcd90ca843b4d1f1b8fee2bc4048a2fdd0f19f8698d0a8fddb9b32ca5333753b"} Dec 09 15:13:53 crc kubenswrapper[4555]: I1209 15:13:53.852502 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-phn8q" event={"ID":"41e8b8f7-9b28-46f9-b474-9d519d0e7875","Type":"ContainerStarted","Data":"ef56fc08a3eeee791a4e36f513bf3482c4275c7b06222a81e6cac36fe74a0f6b"} Dec 09 15:13:53 crc kubenswrapper[4555]: I1209 15:13:53.864621 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-568d4776df-zlwzf" podStartSLOduration=1.8646080550000002 podStartE2EDuration="1.864608055s" podCreationTimestamp="2025-12-09 15:13:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:13:53.864084287 +0000 UTC m=+944.000797005" watchObservedRunningTime="2025-12-09 15:13:53.864608055 +0000 UTC m=+944.001320773" Dec 09 15:13:55 crc kubenswrapper[4555]: I1209 15:13:55.531074 4555 patch_prober.go:28] interesting pod/machine-config-daemon-x6p6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 15:13:55 crc kubenswrapper[4555]: I1209 15:13:55.531253 4555 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 15:13:55 crc kubenswrapper[4555]: I1209 15:13:55.531288 4555 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" Dec 09 15:13:55 crc kubenswrapper[4555]: I1209 15:13:55.531834 4555 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fa5fc33cd5d536ecfcb65483f7919b247963e43ee3446fdfb5d114321c1858f9"} pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 15:13:55 crc kubenswrapper[4555]: I1209 15:13:55.531887 4555 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" containerName="machine-config-daemon" containerID="cri-o://fa5fc33cd5d536ecfcb65483f7919b247963e43ee3446fdfb5d114321c1858f9" gracePeriod=600 Dec 09 15:13:55 crc kubenswrapper[4555]: I1209 15:13:55.869011 4555 generic.go:334] "Generic (PLEG): container finished" podID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" containerID="fa5fc33cd5d536ecfcb65483f7919b247963e43ee3446fdfb5d114321c1858f9" exitCode=0 Dec 09 15:13:55 crc kubenswrapper[4555]: I1209 15:13:55.869067 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" event={"ID":"b929ab12-7d60-4e9f-a1a7-3e2d8b086d36","Type":"ContainerDied","Data":"fa5fc33cd5d536ecfcb65483f7919b247963e43ee3446fdfb5d114321c1858f9"} Dec 09 15:13:55 crc kubenswrapper[4555]: I1209 15:13:55.869096 4555 scope.go:117] "RemoveContainer" containerID="1349c5c8b355954a96a1c092ad88f43ece6bf94e84e0fb6591265f020c3fe076" Dec 09 15:13:56 crc kubenswrapper[4555]: I1209 15:13:56.879069 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-t6lcd" event={"ID":"96ba4ce4-cba2-47a3-b353-e8f3daedfa5a","Type":"ContainerStarted","Data":"c43707979fe4ba0af46380fa83b7278307497fa28d4ea2938160525acbb973d8"} Dec 09 15:13:56 crc kubenswrapper[4555]: I1209 15:13:56.879691 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-t6lcd" Dec 09 15:13:56 crc kubenswrapper[4555]: I1209 15:13:56.880969 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-pdmhs" event={"ID":"06ff1c84-8959-4b54-9fea-12a6d8ed97d5","Type":"ContainerStarted","Data":"2800b11ffe9069dc04fc0b1bb1391fbfca36e310174409ba3f6ef37d9f0ddbab"} Dec 09 15:13:56 crc kubenswrapper[4555]: I1209 15:13:56.881428 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-pdmhs" Dec 09 15:13:56 crc kubenswrapper[4555]: I1209 15:13:56.882569 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-kcpqb" event={"ID":"41bb2d5e-b40e-4088-9ffa-ac0234d7050c","Type":"ContainerStarted","Data":"57e89727c500276385845c296a22400b20415c5853aeac8082e940773b11a7de"} Dec 09 15:13:56 crc kubenswrapper[4555]: I1209 15:13:56.885060 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" event={"ID":"b929ab12-7d60-4e9f-a1a7-3e2d8b086d36","Type":"ContainerStarted","Data":"80bba19695acae3b426d36d271b4edb8798293963becf54942b939c9706ad20f"} Dec 09 15:13:56 crc kubenswrapper[4555]: I1209 15:13:56.886812 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-phn8q" event={"ID":"41e8b8f7-9b28-46f9-b474-9d519d0e7875","Type":"ContainerStarted","Data":"48cc50a97df4c93345c830d97c879c4d9b20dd0fae42c9c5543c192d6d82fe34"} Dec 09 15:13:56 crc kubenswrapper[4555]: I1209 15:13:56.894892 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-t6lcd" podStartSLOduration=1.9613493530000001 podStartE2EDuration="4.894877549s" podCreationTimestamp="2025-12-09 15:13:52 +0000 UTC" firstStartedPulling="2025-12-09 15:13:53.041221513 +0000 UTC m=+943.177934230" lastFinishedPulling="2025-12-09 15:13:55.974749708 +0000 UTC m=+946.111462426" observedRunningTime="2025-12-09 
15:13:56.891248956 +0000 UTC m=+947.027961673" watchObservedRunningTime="2025-12-09 15:13:56.894877549 +0000 UTC m=+947.031590268" Dec 09 15:13:56 crc kubenswrapper[4555]: I1209 15:13:56.915010 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-pdmhs" podStartSLOduration=2.394851853 podStartE2EDuration="4.914992628s" podCreationTimestamp="2025-12-09 15:13:52 +0000 UTC" firstStartedPulling="2025-12-09 15:13:53.454597322 +0000 UTC m=+943.591310039" lastFinishedPulling="2025-12-09 15:13:55.974738095 +0000 UTC m=+946.111450814" observedRunningTime="2025-12-09 15:13:56.912011445 +0000 UTC m=+947.048724163" watchObservedRunningTime="2025-12-09 15:13:56.914992628 +0000 UTC m=+947.051705346" Dec 09 15:13:56 crc kubenswrapper[4555]: I1209 15:13:56.927931 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-kcpqb" podStartSLOduration=2.515586977 podStartE2EDuration="4.92792032s" podCreationTimestamp="2025-12-09 15:13:52 +0000 UTC" firstStartedPulling="2025-12-09 15:13:53.562470566 +0000 UTC m=+943.699183285" lastFinishedPulling="2025-12-09 15:13:55.97480391 +0000 UTC m=+946.111516628" observedRunningTime="2025-12-09 15:13:56.923294245 +0000 UTC m=+947.060006964" watchObservedRunningTime="2025-12-09 15:13:56.92792032 +0000 UTC m=+947.064633037" Dec 09 15:13:58 crc kubenswrapper[4555]: I1209 15:13:58.898737 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-phn8q" event={"ID":"41e8b8f7-9b28-46f9-b474-9d519d0e7875","Type":"ContainerStarted","Data":"73c2eef45b487c131fe520ffe173922c11402188e2134c0b6b7627d8a618c159"} Dec 09 15:13:58 crc kubenswrapper[4555]: I1209 15:13:58.913982 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-phn8q" podStartSLOduration=2.163613415 podStartE2EDuration="6.913966182s" podCreationTimestamp="2025-12-09 15:13:52 +0000 UTC" firstStartedPulling="2025-12-09 15:13:53.454134158 +0000 UTC m=+943.590846876" lastFinishedPulling="2025-12-09 15:13:58.204486925 +0000 UTC m=+948.341199643" observedRunningTime="2025-12-09 15:13:58.910816571 +0000 UTC m=+949.047529289" watchObservedRunningTime="2025-12-09 15:13:58.913966182 +0000 UTC m=+949.050678900" Dec 09 15:14:02 crc kubenswrapper[4555]: I1209 15:14:02.996259 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-t6lcd" Dec 09 15:14:03 crc kubenswrapper[4555]: I1209 15:14:03.218212 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-568d4776df-zlwzf" Dec 09 15:14:03 crc kubenswrapper[4555]: I1209 15:14:03.218248 4555 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-568d4776df-zlwzf" Dec 09 15:14:03 crc kubenswrapper[4555]: I1209 15:14:03.222581 4555 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-568d4776df-zlwzf" Dec 09 15:14:03 crc kubenswrapper[4555]: I1209 15:14:03.923837 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-568d4776df-zlwzf" Dec 09 15:14:03 crc kubenswrapper[4555]: I1209 15:14:03.960063 4555 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-58c6fb9d58-n5r2c"] Dec 09 15:14:12 crc kubenswrapper[4555]: I1209 15:14:12.988782 4555 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-pdmhs" Dec 09 15:14:24 crc kubenswrapper[4555]: I1209 15:14:24.534355 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fszs4"] Dec 09 15:14:24 crc kubenswrapper[4555]: I1209 15:14:24.537592 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fszs4" Dec 09 15:14:24 crc kubenswrapper[4555]: I1209 15:14:24.544217 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 09 15:14:24 crc kubenswrapper[4555]: I1209 15:14:24.551171 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fszs4"] Dec 09 15:14:24 crc kubenswrapper[4555]: I1209 15:14:24.604313 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/59e2835f-791d-4fa3-b024-5d28b08c0bf1-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fszs4\" (UID: \"59e2835f-791d-4fa3-b024-5d28b08c0bf1\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fszs4" Dec 09 15:14:24 crc kubenswrapper[4555]: I1209 15:14:24.604364 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/59e2835f-791d-4fa3-b024-5d28b08c0bf1-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fszs4\" (UID: \"59e2835f-791d-4fa3-b024-5d28b08c0bf1\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fszs4" Dec 09 15:14:24 crc kubenswrapper[4555]: I1209 15:14:24.604391 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxzgg\" (UniqueName: \"kubernetes.io/projected/59e2835f-791d-4fa3-b024-5d28b08c0bf1-kube-api-access-qxzgg\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fszs4\" (UID: \"59e2835f-791d-4fa3-b024-5d28b08c0bf1\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fszs4" Dec 09 15:14:24 crc kubenswrapper[4555]: I1209 15:14:24.705424 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/59e2835f-791d-4fa3-b024-5d28b08c0bf1-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fszs4\" (UID: \"59e2835f-791d-4fa3-b024-5d28b08c0bf1\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fszs4" Dec 09 15:14:24 crc kubenswrapper[4555]: I1209 15:14:24.705480 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/59e2835f-791d-4fa3-b024-5d28b08c0bf1-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fszs4\" (UID: \"59e2835f-791d-4fa3-b024-5d28b08c0bf1\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fszs4" Dec 09 15:14:24 crc kubenswrapper[4555]: I1209 15:14:24.705510 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxzgg\" (UniqueName: \"kubernetes.io/projected/59e2835f-791d-4fa3-b024-5d28b08c0bf1-kube-api-access-qxzgg\") 
pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fszs4\" (UID: \"59e2835f-791d-4fa3-b024-5d28b08c0bf1\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fszs4" Dec 09 15:14:24 crc kubenswrapper[4555]: I1209 15:14:24.705891 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/59e2835f-791d-4fa3-b024-5d28b08c0bf1-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fszs4\" (UID: \"59e2835f-791d-4fa3-b024-5d28b08c0bf1\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fszs4" Dec 09 15:14:24 crc kubenswrapper[4555]: I1209 15:14:24.705902 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/59e2835f-791d-4fa3-b024-5d28b08c0bf1-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fszs4\" (UID: \"59e2835f-791d-4fa3-b024-5d28b08c0bf1\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fszs4" Dec 09 15:14:24 crc kubenswrapper[4555]: I1209 15:14:24.721279 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxzgg\" (UniqueName: \"kubernetes.io/projected/59e2835f-791d-4fa3-b024-5d28b08c0bf1-kube-api-access-qxzgg\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fszs4\" (UID: \"59e2835f-791d-4fa3-b024-5d28b08c0bf1\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fszs4" Dec 09 15:14:24 crc kubenswrapper[4555]: I1209 15:14:24.856443 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fszs4" Dec 09 15:14:25 crc kubenswrapper[4555]: I1209 15:14:25.200605 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fszs4"] Dec 09 15:14:25 crc kubenswrapper[4555]: W1209 15:14:25.203881 4555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59e2835f_791d_4fa3_b024_5d28b08c0bf1.slice/crio-644877586a67dce97be90bb1dfc4c2764649f6271edeea19d33a8cf8f874aa51 WatchSource:0}: Error finding container 644877586a67dce97be90bb1dfc4c2764649f6271edeea19d33a8cf8f874aa51: Status 404 returned error can't find the container with id 644877586a67dce97be90bb1dfc4c2764649f6271edeea19d33a8cf8f874aa51 Dec 09 15:14:26 crc kubenswrapper[4555]: I1209 15:14:26.040050 4555 generic.go:334] "Generic (PLEG): container finished" podID="59e2835f-791d-4fa3-b024-5d28b08c0bf1" containerID="3100da48445c01c011c94fbe07c5512ce89fa67fe13bf0981af7d6f670e9bb62" exitCode=0 Dec 09 15:14:26 crc kubenswrapper[4555]: I1209 15:14:26.040086 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fszs4" event={"ID":"59e2835f-791d-4fa3-b024-5d28b08c0bf1","Type":"ContainerDied","Data":"3100da48445c01c011c94fbe07c5512ce89fa67fe13bf0981af7d6f670e9bb62"} Dec 09 15:14:26 crc kubenswrapper[4555]: I1209 15:14:26.040109 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fszs4" event={"ID":"59e2835f-791d-4fa3-b024-5d28b08c0bf1","Type":"ContainerStarted","Data":"644877586a67dce97be90bb1dfc4c2764649f6271edeea19d33a8cf8f874aa51"} Dec 09 15:14:28 
crc kubenswrapper[4555]: I1209 15:14:28.051116 4555 generic.go:334] "Generic (PLEG): container finished" podID="59e2835f-791d-4fa3-b024-5d28b08c0bf1" containerID="f7eb8487c3026dfa1a8e7425507d03c7f4a91296761ec9887be3c6e5d8088e89" exitCode=0 Dec 09 15:14:28 crc kubenswrapper[4555]: I1209 15:14:28.051157 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fszs4" event={"ID":"59e2835f-791d-4fa3-b024-5d28b08c0bf1","Type":"ContainerDied","Data":"f7eb8487c3026dfa1a8e7425507d03c7f4a91296761ec9887be3c6e5d8088e89"} Dec 09 15:14:28 crc kubenswrapper[4555]: I1209 15:14:28.986385 4555 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-58c6fb9d58-n5r2c" podUID="e80fda3a-44a2-4671-899a-b041186264b5" containerName="console" containerID="cri-o://022f771bbe19f1266d06ea313b354900c231527e2cb369b81f99e19878b29f6f" gracePeriod=15 Dec 09 15:14:29 crc kubenswrapper[4555]: I1209 15:14:29.058107 4555 generic.go:334] "Generic (PLEG): container finished" podID="59e2835f-791d-4fa3-b024-5d28b08c0bf1" containerID="461ef1cbe87f66d63d866fd3559bb724a36bb38499f14e570ec8b6fa5d6b2f88" exitCode=0 Dec 09 15:14:29 crc kubenswrapper[4555]: I1209 15:14:29.058183 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fszs4" event={"ID":"59e2835f-791d-4fa3-b024-5d28b08c0bf1","Type":"ContainerDied","Data":"461ef1cbe87f66d63d866fd3559bb724a36bb38499f14e570ec8b6fa5d6b2f88"} Dec 09 15:14:29 crc kubenswrapper[4555]: I1209 15:14:29.308289 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-58c6fb9d58-n5r2c_e80fda3a-44a2-4671-899a-b041186264b5/console/0.log" Dec 09 15:14:29 crc kubenswrapper[4555]: I1209 15:14:29.308358 4555 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-58c6fb9d58-n5r2c" Dec 09 15:14:29 crc kubenswrapper[4555]: I1209 15:14:29.474651 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e80fda3a-44a2-4671-899a-b041186264b5-console-oauth-config\") pod \"e80fda3a-44a2-4671-899a-b041186264b5\" (UID: \"e80fda3a-44a2-4671-899a-b041186264b5\") " Dec 09 15:14:29 crc kubenswrapper[4555]: I1209 15:14:29.474706 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6bft\" (UniqueName: \"kubernetes.io/projected/e80fda3a-44a2-4671-899a-b041186264b5-kube-api-access-c6bft\") pod \"e80fda3a-44a2-4671-899a-b041186264b5\" (UID: \"e80fda3a-44a2-4671-899a-b041186264b5\") " Dec 09 15:14:29 crc kubenswrapper[4555]: I1209 15:14:29.474727 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e80fda3a-44a2-4671-899a-b041186264b5-service-ca\") pod \"e80fda3a-44a2-4671-899a-b041186264b5\" (UID: \"e80fda3a-44a2-4671-899a-b041186264b5\") " Dec 09 15:14:29 crc kubenswrapper[4555]: I1209 15:14:29.474770 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e80fda3a-44a2-4671-899a-b041186264b5-trusted-ca-bundle\") pod \"e80fda3a-44a2-4671-899a-b041186264b5\" (UID: \"e80fda3a-44a2-4671-899a-b041186264b5\") " Dec 09 15:14:29 crc kubenswrapper[4555]: I1209 15:14:29.474795 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e80fda3a-44a2-4671-899a-b041186264b5-oauth-serving-cert\") pod \"e80fda3a-44a2-4671-899a-b041186264b5\" (UID: \"e80fda3a-44a2-4671-899a-b041186264b5\") " Dec 09 15:14:29 crc kubenswrapper[4555]: I1209 15:14:29.474830 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e80fda3a-44a2-4671-899a-b041186264b5-console-config\") pod \"e80fda3a-44a2-4671-899a-b041186264b5\" (UID: \"e80fda3a-44a2-4671-899a-b041186264b5\") " Dec 09 15:14:29 crc kubenswrapper[4555]: I1209 15:14:29.474880 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e80fda3a-44a2-4671-899a-b041186264b5-console-serving-cert\") pod \"e80fda3a-44a2-4671-899a-b041186264b5\" (UID: \"e80fda3a-44a2-4671-899a-b041186264b5\") " Dec 09 15:14:29 crc kubenswrapper[4555]: I1209 15:14:29.475452 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e80fda3a-44a2-4671-899a-b041186264b5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "e80fda3a-44a2-4671-899a-b041186264b5" (UID: "e80fda3a-44a2-4671-899a-b041186264b5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 15:14:29 crc kubenswrapper[4555]: I1209 15:14:29.475478 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e80fda3a-44a2-4671-899a-b041186264b5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "e80fda3a-44a2-4671-899a-b041186264b5" (UID: "e80fda3a-44a2-4671-899a-b041186264b5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 15:14:29 crc kubenswrapper[4555]: I1209 15:14:29.475466 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e80fda3a-44a2-4671-899a-b041186264b5-service-ca" (OuterVolumeSpecName: "service-ca") pod "e80fda3a-44a2-4671-899a-b041186264b5" (UID: "e80fda3a-44a2-4671-899a-b041186264b5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 15:14:29 crc kubenswrapper[4555]: I1209 15:14:29.475660 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e80fda3a-44a2-4671-899a-b041186264b5-console-config" (OuterVolumeSpecName: "console-config") pod "e80fda3a-44a2-4671-899a-b041186264b5" (UID: "e80fda3a-44a2-4671-899a-b041186264b5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 15:14:29 crc kubenswrapper[4555]: I1209 15:14:29.478984 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e80fda3a-44a2-4671-899a-b041186264b5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "e80fda3a-44a2-4671-899a-b041186264b5" (UID: "e80fda3a-44a2-4671-899a-b041186264b5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 15:14:29 crc kubenswrapper[4555]: I1209 15:14:29.479020 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e80fda3a-44a2-4671-899a-b041186264b5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "e80fda3a-44a2-4671-899a-b041186264b5" (UID: "e80fda3a-44a2-4671-899a-b041186264b5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 15:14:29 crc kubenswrapper[4555]: I1209 15:14:29.479119 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e80fda3a-44a2-4671-899a-b041186264b5-kube-api-access-c6bft" (OuterVolumeSpecName: "kube-api-access-c6bft") pod "e80fda3a-44a2-4671-899a-b041186264b5" (UID: "e80fda3a-44a2-4671-899a-b041186264b5"). InnerVolumeSpecName "kube-api-access-c6bft". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:14:29 crc kubenswrapper[4555]: I1209 15:14:29.576486 4555 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e80fda3a-44a2-4671-899a-b041186264b5-console-config\") on node \"crc\" DevicePath \"\"" Dec 09 15:14:29 crc kubenswrapper[4555]: I1209 15:14:29.576514 4555 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e80fda3a-44a2-4671-899a-b041186264b5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 15:14:29 crc kubenswrapper[4555]: I1209 15:14:29.576523 4555 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e80fda3a-44a2-4671-899a-b041186264b5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 09 15:14:29 crc kubenswrapper[4555]: I1209 15:14:29.576532 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6bft\" (UniqueName: \"kubernetes.io/projected/e80fda3a-44a2-4671-899a-b041186264b5-kube-api-access-c6bft\") on node \"crc\" DevicePath \"\"" Dec 09 15:14:29 crc kubenswrapper[4555]: I1209 15:14:29.576541 4555 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e80fda3a-44a2-4671-899a-b041186264b5-service-ca\") on node \"crc\" DevicePath \"\"" Dec 09 15:14:29 crc kubenswrapper[4555]: I1209 15:14:29.576549 4555 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e80fda3a-44a2-4671-899a-b041186264b5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 15:14:29 crc kubenswrapper[4555]: I1209 15:14:29.576555 4555 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e80fda3a-44a2-4671-899a-b041186264b5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 15:14:30 crc kubenswrapper[4555]: I1209 15:14:30.064811 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-58c6fb9d58-n5r2c_e80fda3a-44a2-4671-899a-b041186264b5/console/0.log" Dec 09 15:14:30 crc kubenswrapper[4555]: I1209 15:14:30.065042 4555 generic.go:334] "Generic (PLEG): container finished" podID="e80fda3a-44a2-4671-899a-b041186264b5" containerID="022f771bbe19f1266d06ea313b354900c231527e2cb369b81f99e19878b29f6f" exitCode=2 Dec 09 15:14:30 crc kubenswrapper[4555]: I1209 15:14:30.065105 4555 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-58c6fb9d58-n5r2c" Dec 09 15:14:30 crc kubenswrapper[4555]: I1209 15:14:30.065097 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-58c6fb9d58-n5r2c" event={"ID":"e80fda3a-44a2-4671-899a-b041186264b5","Type":"ContainerDied","Data":"022f771bbe19f1266d06ea313b354900c231527e2cb369b81f99e19878b29f6f"} Dec 09 15:14:30 crc kubenswrapper[4555]: I1209 15:14:30.065159 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-58c6fb9d58-n5r2c" event={"ID":"e80fda3a-44a2-4671-899a-b041186264b5","Type":"ContainerDied","Data":"8695f24c55619ab8e1b1b5797b4e7956d4210ecf4fd3928b4c9d8ed021f27072"} Dec 09 15:14:30 crc kubenswrapper[4555]: I1209 15:14:30.065181 4555 scope.go:117] "RemoveContainer" containerID="022f771bbe19f1266d06ea313b354900c231527e2cb369b81f99e19878b29f6f" Dec 09 15:14:30 crc kubenswrapper[4555]: I1209 15:14:30.080187 4555 scope.go:117] "RemoveContainer" containerID="022f771bbe19f1266d06ea313b354900c231527e2cb369b81f99e19878b29f6f" Dec 09 15:14:30 crc kubenswrapper[4555]: E1209 15:14:30.080632 4555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"022f771bbe19f1266d06ea313b354900c231527e2cb369b81f99e19878b29f6f\": container with ID starting with 022f771bbe19f1266d06ea313b354900c231527e2cb369b81f99e19878b29f6f not found: ID does not exist" containerID="022f771bbe19f1266d06ea313b354900c231527e2cb369b81f99e19878b29f6f" Dec 09 15:14:30 crc kubenswrapper[4555]: I1209 15:14:30.080663 4555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"022f771bbe19f1266d06ea313b354900c231527e2cb369b81f99e19878b29f6f"} err="failed to get container status \"022f771bbe19f1266d06ea313b354900c231527e2cb369b81f99e19878b29f6f\": rpc error: code = NotFound desc = could not find container \"022f771bbe19f1266d06ea313b354900c231527e2cb369b81f99e19878b29f6f\": container with ID starting with 022f771bbe19f1266d06ea313b354900c231527e2cb369b81f99e19878b29f6f not found: ID does not exist" Dec 09 15:14:30 crc kubenswrapper[4555]: I1209 15:14:30.088909 4555 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-58c6fb9d58-n5r2c"] Dec 09 15:14:30 crc kubenswrapper[4555]: I1209 15:14:30.092993 4555 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-58c6fb9d58-n5r2c"] Dec 09 15:14:30 crc kubenswrapper[4555]: I1209 15:14:30.171868 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e80fda3a-44a2-4671-899a-b041186264b5" path="/var/lib/kubelet/pods/e80fda3a-44a2-4671-899a-b041186264b5/volumes" Dec 09 15:14:30 crc kubenswrapper[4555]: I1209 15:14:30.273059 4555 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fszs4" Dec 09 15:14:30 crc kubenswrapper[4555]: I1209 15:14:30.388968 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/59e2835f-791d-4fa3-b024-5d28b08c0bf1-bundle\") pod \"59e2835f-791d-4fa3-b024-5d28b08c0bf1\" (UID: \"59e2835f-791d-4fa3-b024-5d28b08c0bf1\") " Dec 09 15:14:30 crc kubenswrapper[4555]: I1209 15:14:30.389028 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxzgg\" (UniqueName: \"kubernetes.io/projected/59e2835f-791d-4fa3-b024-5d28b08c0bf1-kube-api-access-qxzgg\") pod \"59e2835f-791d-4fa3-b024-5d28b08c0bf1\" (UID: \"59e2835f-791d-4fa3-b024-5d28b08c0bf1\") " Dec 09 15:14:30 crc kubenswrapper[4555]: I1209 15:14:30.389099 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/59e2835f-791d-4fa3-b024-5d28b08c0bf1-util\") pod \"59e2835f-791d-4fa3-b024-5d28b08c0bf1\" (UID: \"59e2835f-791d-4fa3-b024-5d28b08c0bf1\") " Dec 09 15:14:30 crc kubenswrapper[4555]: I1209 15:14:30.389808 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59e2835f-791d-4fa3-b024-5d28b08c0bf1-bundle" (OuterVolumeSpecName: "bundle") pod "59e2835f-791d-4fa3-b024-5d28b08c0bf1" (UID: "59e2835f-791d-4fa3-b024-5d28b08c0bf1"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:14:30 crc kubenswrapper[4555]: I1209 15:14:30.392495 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59e2835f-791d-4fa3-b024-5d28b08c0bf1-kube-api-access-qxzgg" (OuterVolumeSpecName: "kube-api-access-qxzgg") pod "59e2835f-791d-4fa3-b024-5d28b08c0bf1" (UID: "59e2835f-791d-4fa3-b024-5d28b08c0bf1"). InnerVolumeSpecName "kube-api-access-qxzgg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:14:30 crc kubenswrapper[4555]: I1209 15:14:30.398864 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59e2835f-791d-4fa3-b024-5d28b08c0bf1-util" (OuterVolumeSpecName: "util") pod "59e2835f-791d-4fa3-b024-5d28b08c0bf1" (UID: "59e2835f-791d-4fa3-b024-5d28b08c0bf1"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:14:30 crc kubenswrapper[4555]: I1209 15:14:30.490471 4555 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/59e2835f-791d-4fa3-b024-5d28b08c0bf1-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 15:14:30 crc kubenswrapper[4555]: I1209 15:14:30.490497 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxzgg\" (UniqueName: \"kubernetes.io/projected/59e2835f-791d-4fa3-b024-5d28b08c0bf1-kube-api-access-qxzgg\") on node \"crc\" DevicePath \"\"" Dec 09 15:14:30 crc kubenswrapper[4555]: I1209 15:14:30.490508 4555 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/59e2835f-791d-4fa3-b024-5d28b08c0bf1-util\") on node \"crc\" DevicePath \"\"" Dec 09 15:14:31 crc kubenswrapper[4555]: I1209 15:14:31.072407 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fszs4" event={"ID":"59e2835f-791d-4fa3-b024-5d28b08c0bf1","Type":"ContainerDied","Data":"644877586a67dce97be90bb1dfc4c2764649f6271edeea19d33a8cf8f874aa51"} Dec 09 15:14:31 crc kubenswrapper[4555]: I1209 15:14:31.073121 4555 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="644877586a67dce97be90bb1dfc4c2764649f6271edeea19d33a8cf8f874aa51" Dec 09 15:14:31 crc kubenswrapper[4555]: I1209 15:14:31.072467 4555 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fszs4" Dec 09 15:14:39 crc kubenswrapper[4555]: I1209 15:14:39.611621 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-67986b9b4b-shcnp"] Dec 09 15:14:39 crc kubenswrapper[4555]: E1209 15:14:39.612181 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e80fda3a-44a2-4671-899a-b041186264b5" containerName="console" Dec 09 15:14:39 crc kubenswrapper[4555]: I1209 15:14:39.612193 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="e80fda3a-44a2-4671-899a-b041186264b5" containerName="console" Dec 09 15:14:39 crc kubenswrapper[4555]: E1209 15:14:39.612204 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59e2835f-791d-4fa3-b024-5d28b08c0bf1" containerName="pull" Dec 09 15:14:39 crc kubenswrapper[4555]: I1209 15:14:39.612210 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="59e2835f-791d-4fa3-b024-5d28b08c0bf1" containerName="pull" Dec 09 15:14:39 crc kubenswrapper[4555]: E1209 15:14:39.612217 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59e2835f-791d-4fa3-b024-5d28b08c0bf1" containerName="extract" Dec 09 15:14:39 crc kubenswrapper[4555]: I1209 15:14:39.612223 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="59e2835f-791d-4fa3-b024-5d28b08c0bf1" containerName="extract" Dec 09 15:14:39 crc kubenswrapper[4555]: E1209 15:14:39.612244 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59e2835f-791d-4fa3-b024-5d28b08c0bf1" containerName="util" Dec 09 15:14:39 crc kubenswrapper[4555]: I1209 15:14:39.612250 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="59e2835f-791d-4fa3-b024-5d28b08c0bf1" containerName="util" Dec 09 15:14:39 crc kubenswrapper[4555]: I1209 15:14:39.612390 4555 memory_manager.go:354] "RemoveStaleState removing state" podUID="e80fda3a-44a2-4671-899a-b041186264b5" containerName="console" Dec 
09 15:14:39 crc kubenswrapper[4555]: I1209 15:14:39.612400 4555 memory_manager.go:354] "RemoveStaleState removing state" podUID="59e2835f-791d-4fa3-b024-5d28b08c0bf1" containerName="extract" Dec 09 15:14:39 crc kubenswrapper[4555]: I1209 15:14:39.612835 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-67986b9b4b-shcnp" Dec 09 15:14:39 crc kubenswrapper[4555]: I1209 15:14:39.614432 4555 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Dec 09 15:14:39 crc kubenswrapper[4555]: I1209 15:14:39.615739 4555 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-b57k4" Dec 09 15:14:39 crc kubenswrapper[4555]: I1209 15:14:39.616071 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Dec 09 15:14:39 crc kubenswrapper[4555]: I1209 15:14:39.616075 4555 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Dec 09 15:14:39 crc kubenswrapper[4555]: I1209 15:14:39.616140 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Dec 09 15:14:39 crc kubenswrapper[4555]: I1209 15:14:39.626615 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-67986b9b4b-shcnp"] Dec 09 15:14:39 crc kubenswrapper[4555]: I1209 15:14:39.715160 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g27lt\" (UniqueName: \"kubernetes.io/projected/a1efc90d-b721-4bc6-a9e3-74280e8f541d-kube-api-access-g27lt\") pod \"metallb-operator-controller-manager-67986b9b4b-shcnp\" (UID: \"a1efc90d-b721-4bc6-a9e3-74280e8f541d\") " pod="metallb-system/metallb-operator-controller-manager-67986b9b4b-shcnp" Dec 09 15:14:39 crc kubenswrapper[4555]: I1209 15:14:39.715315 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a1efc90d-b721-4bc6-a9e3-74280e8f541d-apiservice-cert\") pod \"metallb-operator-controller-manager-67986b9b4b-shcnp\" (UID: \"a1efc90d-b721-4bc6-a9e3-74280e8f541d\") " pod="metallb-system/metallb-operator-controller-manager-67986b9b4b-shcnp" Dec 09 15:14:39 crc kubenswrapper[4555]: I1209 15:14:39.715353 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a1efc90d-b721-4bc6-a9e3-74280e8f541d-webhook-cert\") pod \"metallb-operator-controller-manager-67986b9b4b-shcnp\" (UID: \"a1efc90d-b721-4bc6-a9e3-74280e8f541d\") " pod="metallb-system/metallb-operator-controller-manager-67986b9b4b-shcnp" Dec 09 15:14:39 crc kubenswrapper[4555]: I1209 15:14:39.817393 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g27lt\" (UniqueName: \"kubernetes.io/projected/a1efc90d-b721-4bc6-a9e3-74280e8f541d-kube-api-access-g27lt\") pod \"metallb-operator-controller-manager-67986b9b4b-shcnp\" (UID: \"a1efc90d-b721-4bc6-a9e3-74280e8f541d\") " pod="metallb-system/metallb-operator-controller-manager-67986b9b4b-shcnp" Dec 09 15:14:39 crc kubenswrapper[4555]: I1209 15:14:39.817479 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/a1efc90d-b721-4bc6-a9e3-74280e8f541d-apiservice-cert\") pod \"metallb-operator-controller-manager-67986b9b4b-shcnp\" (UID: \"a1efc90d-b721-4bc6-a9e3-74280e8f541d\") " pod="metallb-system/metallb-operator-controller-manager-67986b9b4b-shcnp" Dec 09 15:14:39 crc kubenswrapper[4555]: I1209 15:14:39.817523 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a1efc90d-b721-4bc6-a9e3-74280e8f541d-webhook-cert\") pod \"metallb-operator-controller-manager-67986b9b4b-shcnp\" (UID: \"a1efc90d-b721-4bc6-a9e3-74280e8f541d\") " pod="metallb-system/metallb-operator-controller-manager-67986b9b4b-shcnp" Dec 09 15:14:39 crc kubenswrapper[4555]: I1209 15:14:39.823986 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a1efc90d-b721-4bc6-a9e3-74280e8f541d-webhook-cert\") pod \"metallb-operator-controller-manager-67986b9b4b-shcnp\" (UID: \"a1efc90d-b721-4bc6-a9e3-74280e8f541d\") " pod="metallb-system/metallb-operator-controller-manager-67986b9b4b-shcnp" Dec 09 15:14:39 crc kubenswrapper[4555]: I1209 15:14:39.824723 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a1efc90d-b721-4bc6-a9e3-74280e8f541d-apiservice-cert\") pod \"metallb-operator-controller-manager-67986b9b4b-shcnp\" (UID: \"a1efc90d-b721-4bc6-a9e3-74280e8f541d\") " pod="metallb-system/metallb-operator-controller-manager-67986b9b4b-shcnp" Dec 09 15:14:39 crc kubenswrapper[4555]: I1209 15:14:39.836145 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g27lt\" (UniqueName: \"kubernetes.io/projected/a1efc90d-b721-4bc6-a9e3-74280e8f541d-kube-api-access-g27lt\") pod \"metallb-operator-controller-manager-67986b9b4b-shcnp\" (UID: \"a1efc90d-b721-4bc6-a9e3-74280e8f541d\") " pod="metallb-system/metallb-operator-controller-manager-67986b9b4b-shcnp" Dec 09 15:14:39 crc kubenswrapper[4555]: I1209 15:14:39.848604 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-9b87bb679-nztx8"] Dec 09 15:14:39 crc kubenswrapper[4555]: I1209 15:14:39.849392 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-9b87bb679-nztx8" Dec 09 15:14:39 crc kubenswrapper[4555]: I1209 15:14:39.851123 4555 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-8rth2" Dec 09 15:14:39 crc kubenswrapper[4555]: I1209 15:14:39.851123 4555 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 09 15:14:39 crc kubenswrapper[4555]: I1209 15:14:39.851566 4555 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Dec 09 15:14:39 crc kubenswrapper[4555]: I1209 15:14:39.859280 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-9b87bb679-nztx8"] Dec 09 15:14:39 crc kubenswrapper[4555]: I1209 15:14:39.927125 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-67986b9b4b-shcnp" Dec 09 15:14:40 crc kubenswrapper[4555]: I1209 15:14:40.021495 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/345488ea-f298-4caa-9128-0bcbca8581f0-apiservice-cert\") pod \"metallb-operator-webhook-server-9b87bb679-nztx8\" (UID: \"345488ea-f298-4caa-9128-0bcbca8581f0\") " pod="metallb-system/metallb-operator-webhook-server-9b87bb679-nztx8" Dec 09 15:14:40 crc kubenswrapper[4555]: I1209 15:14:40.021699 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqjkt\" (UniqueName: \"kubernetes.io/projected/345488ea-f298-4caa-9128-0bcbca8581f0-kube-api-access-hqjkt\") pod \"metallb-operator-webhook-server-9b87bb679-nztx8\" (UID: \"345488ea-f298-4caa-9128-0bcbca8581f0\") " pod="metallb-system/metallb-operator-webhook-server-9b87bb679-nztx8" Dec 09 15:14:40 crc kubenswrapper[4555]: I1209 15:14:40.021721 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/345488ea-f298-4caa-9128-0bcbca8581f0-webhook-cert\") pod \"metallb-operator-webhook-server-9b87bb679-nztx8\" (UID: \"345488ea-f298-4caa-9128-0bcbca8581f0\") " pod="metallb-system/metallb-operator-webhook-server-9b87bb679-nztx8" Dec 09 15:14:40 crc kubenswrapper[4555]: I1209 15:14:40.122829 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/345488ea-f298-4caa-9128-0bcbca8581f0-apiservice-cert\") pod \"metallb-operator-webhook-server-9b87bb679-nztx8\" (UID: \"345488ea-f298-4caa-9128-0bcbca8581f0\") " pod="metallb-system/metallb-operator-webhook-server-9b87bb679-nztx8" Dec 09 15:14:40 crc kubenswrapper[4555]: I1209 15:14:40.122891 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqjkt\" (UniqueName: \"kubernetes.io/projected/345488ea-f298-4caa-9128-0bcbca8581f0-kube-api-access-hqjkt\") pod \"metallb-operator-webhook-server-9b87bb679-nztx8\" (UID: \"345488ea-f298-4caa-9128-0bcbca8581f0\") " pod="metallb-system/metallb-operator-webhook-server-9b87bb679-nztx8" Dec 09 15:14:40 crc kubenswrapper[4555]: I1209 15:14:40.122908 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/345488ea-f298-4caa-9128-0bcbca8581f0-webhook-cert\") pod \"metallb-operator-webhook-server-9b87bb679-nztx8\" (UID: \"345488ea-f298-4caa-9128-0bcbca8581f0\") " pod="metallb-system/metallb-operator-webhook-server-9b87bb679-nztx8" Dec 09 15:14:40 crc kubenswrapper[4555]: I1209 15:14:40.130086 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/345488ea-f298-4caa-9128-0bcbca8581f0-apiservice-cert\") pod \"metallb-operator-webhook-server-9b87bb679-nztx8\" (UID: \"345488ea-f298-4caa-9128-0bcbca8581f0\") " pod="metallb-system/metallb-operator-webhook-server-9b87bb679-nztx8" Dec 09 15:14:40 crc kubenswrapper[4555]: I1209 15:14:40.131943 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/345488ea-f298-4caa-9128-0bcbca8581f0-webhook-cert\") pod \"metallb-operator-webhook-server-9b87bb679-nztx8\" (UID: \"345488ea-f298-4caa-9128-0bcbca8581f0\") " 
pod="metallb-system/metallb-operator-webhook-server-9b87bb679-nztx8" Dec 09 15:14:40 crc kubenswrapper[4555]: I1209 15:14:40.136977 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqjkt\" (UniqueName: \"kubernetes.io/projected/345488ea-f298-4caa-9128-0bcbca8581f0-kube-api-access-hqjkt\") pod \"metallb-operator-webhook-server-9b87bb679-nztx8\" (UID: \"345488ea-f298-4caa-9128-0bcbca8581f0\") " pod="metallb-system/metallb-operator-webhook-server-9b87bb679-nztx8" Dec 09 15:14:40 crc kubenswrapper[4555]: I1209 15:14:40.188853 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-9b87bb679-nztx8" Dec 09 15:14:40 crc kubenswrapper[4555]: I1209 15:14:40.299260 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-67986b9b4b-shcnp"] Dec 09 15:14:40 crc kubenswrapper[4555]: I1209 15:14:40.311056 4555 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 09 15:14:40 crc kubenswrapper[4555]: I1209 15:14:40.548360 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-9b87bb679-nztx8"] Dec 09 15:14:41 crc kubenswrapper[4555]: I1209 15:14:41.123553 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-67986b9b4b-shcnp" event={"ID":"a1efc90d-b721-4bc6-a9e3-74280e8f541d","Type":"ContainerStarted","Data":"5a26ecfe8486fdc65770da73833e721de9ce80e4a8a0c69eebbf23732e196649"} Dec 09 15:14:41 crc kubenswrapper[4555]: I1209 15:14:41.124513 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-9b87bb679-nztx8" event={"ID":"345488ea-f298-4caa-9128-0bcbca8581f0","Type":"ContainerStarted","Data":"6e0a736f9912a695e34228e9d759c5689228fc9d0dc9dda4a63ce126e854ab44"} Dec 09 15:14:44 crc kubenswrapper[4555]: I1209 15:14:44.142688 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-67986b9b4b-shcnp" event={"ID":"a1efc90d-b721-4bc6-a9e3-74280e8f541d","Type":"ContainerStarted","Data":"8b0b43bf0d187387fdc3c656726058732caf0fa0b43f75686772697c1fd500cd"} Dec 09 15:14:44 crc kubenswrapper[4555]: I1209 15:14:44.143808 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-67986b9b4b-shcnp" Dec 09 15:14:44 crc kubenswrapper[4555]: I1209 15:14:44.161051 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-67986b9b4b-shcnp" podStartSLOduration=1.721326537 podStartE2EDuration="5.161036551s" podCreationTimestamp="2025-12-09 15:14:39 +0000 UTC" firstStartedPulling="2025-12-09 15:14:40.310813159 +0000 UTC m=+990.447525876" lastFinishedPulling="2025-12-09 15:14:43.750523171 +0000 UTC m=+993.887235890" observedRunningTime="2025-12-09 15:14:44.156712285 +0000 UTC m=+994.293425004" watchObservedRunningTime="2025-12-09 15:14:44.161036551 +0000 UTC m=+994.297749269" Dec 09 15:14:47 crc kubenswrapper[4555]: I1209 15:14:47.163172 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-9b87bb679-nztx8" event={"ID":"345488ea-f298-4caa-9128-0bcbca8581f0","Type":"ContainerStarted","Data":"2222d8e1286b482806de3bddf8aeea00a8968c1b6998862f0b2943e4ac20357d"} Dec 09 15:14:47 crc kubenswrapper[4555]: I1209 15:14:47.163611 4555 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-9b87bb679-nztx8" Dec 09 15:14:47 crc kubenswrapper[4555]: I1209 15:14:47.177715 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-9b87bb679-nztx8" podStartSLOduration=1.7359540679999999 podStartE2EDuration="8.177707078s" podCreationTimestamp="2025-12-09 15:14:39 +0000 UTC" firstStartedPulling="2025-12-09 15:14:40.555223349 +0000 UTC m=+990.691936068" lastFinishedPulling="2025-12-09 15:14:46.99697636 +0000 UTC m=+997.133689078" observedRunningTime="2025-12-09 15:14:47.175089421 +0000 UTC m=+997.311802149" watchObservedRunningTime="2025-12-09 15:14:47.177707078 +0000 UTC m=+997.314419797" Dec 09 15:15:00 crc kubenswrapper[4555]: I1209 15:15:00.134619 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421555-vlfbm"] Dec 09 15:15:00 crc kubenswrapper[4555]: I1209 15:15:00.135795 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421555-vlfbm" Dec 09 15:15:00 crc kubenswrapper[4555]: I1209 15:15:00.137115 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 09 15:15:00 crc kubenswrapper[4555]: I1209 15:15:00.137167 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 09 15:15:00 crc kubenswrapper[4555]: I1209 15:15:00.140524 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421555-vlfbm"] Dec 09 15:15:00 crc kubenswrapper[4555]: I1209 15:15:00.192257 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-9b87bb679-nztx8" Dec 09 15:15:00 crc kubenswrapper[4555]: I1209 15:15:00.313932 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/43ec2a34-4530-4f67-ab99-937daf5420e2-config-volume\") pod \"collect-profiles-29421555-vlfbm\" (UID: \"43ec2a34-4530-4f67-ab99-937daf5420e2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421555-vlfbm" Dec 09 15:15:00 crc kubenswrapper[4555]: I1209 15:15:00.313988 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blw6r\" (UniqueName: \"kubernetes.io/projected/43ec2a34-4530-4f67-ab99-937daf5420e2-kube-api-access-blw6r\") pod \"collect-profiles-29421555-vlfbm\" (UID: \"43ec2a34-4530-4f67-ab99-937daf5420e2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421555-vlfbm" Dec 09 15:15:00 crc kubenswrapper[4555]: I1209 15:15:00.314130 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/43ec2a34-4530-4f67-ab99-937daf5420e2-secret-volume\") pod \"collect-profiles-29421555-vlfbm\" (UID: \"43ec2a34-4530-4f67-ab99-937daf5420e2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421555-vlfbm" Dec 09 15:15:00 crc kubenswrapper[4555]: I1209 15:15:00.424408 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/43ec2a34-4530-4f67-ab99-937daf5420e2-config-volume\") pod \"collect-profiles-29421555-vlfbm\" (UID: \"43ec2a34-4530-4f67-ab99-937daf5420e2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421555-vlfbm" Dec 09 15:15:00 crc kubenswrapper[4555]: I1209 15:15:00.424463 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blw6r\" (UniqueName: \"kubernetes.io/projected/43ec2a34-4530-4f67-ab99-937daf5420e2-kube-api-access-blw6r\") pod \"collect-profiles-29421555-vlfbm\" (UID: \"43ec2a34-4530-4f67-ab99-937daf5420e2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421555-vlfbm" Dec 09 15:15:00 crc kubenswrapper[4555]: I1209 15:15:00.424777 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/43ec2a34-4530-4f67-ab99-937daf5420e2-secret-volume\") pod \"collect-profiles-29421555-vlfbm\" (UID: \"43ec2a34-4530-4f67-ab99-937daf5420e2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421555-vlfbm" Dec 09 15:15:00 crc kubenswrapper[4555]: I1209 15:15:00.425575 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/43ec2a34-4530-4f67-ab99-937daf5420e2-config-volume\") pod \"collect-profiles-29421555-vlfbm\" (UID: \"43ec2a34-4530-4f67-ab99-937daf5420e2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421555-vlfbm" Dec 09 15:15:00 crc kubenswrapper[4555]: I1209 15:15:00.429381 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/43ec2a34-4530-4f67-ab99-937daf5420e2-secret-volume\") pod \"collect-profiles-29421555-vlfbm\" (UID: \"43ec2a34-4530-4f67-ab99-937daf5420e2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421555-vlfbm" Dec 09 15:15:00 crc kubenswrapper[4555]: I1209 15:15:00.438230 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blw6r\" (UniqueName: \"kubernetes.io/projected/43ec2a34-4530-4f67-ab99-937daf5420e2-kube-api-access-blw6r\") pod \"collect-profiles-29421555-vlfbm\" (UID: \"43ec2a34-4530-4f67-ab99-937daf5420e2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421555-vlfbm" Dec 09 15:15:00 crc kubenswrapper[4555]: I1209 15:15:00.450060 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421555-vlfbm" Dec 09 15:15:00 crc kubenswrapper[4555]: I1209 15:15:00.803836 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421555-vlfbm"] Dec 09 15:15:00 crc kubenswrapper[4555]: E1209 15:15:00.822132 4555 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43ec2a34_4530_4f67_ab99_937daf5420e2.slice/crio-12e895ed0c7dbd7b17d9ee583783f2c2da97b2efeac36121e6b5041075d54a0b\": RecentStats: unable to find data in memory cache]" Dec 09 15:15:01 crc kubenswrapper[4555]: I1209 15:15:01.238447 4555 generic.go:334] "Generic (PLEG): container finished" podID="43ec2a34-4530-4f67-ab99-937daf5420e2" containerID="631e30e0bb3fc865a19f88be0208370bf8b0494c88f884612bcc9b589ba5fcef" exitCode=0 Dec 09 15:15:01 crc kubenswrapper[4555]: I1209 15:15:01.238490 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421555-vlfbm" event={"ID":"43ec2a34-4530-4f67-ab99-937daf5420e2","Type":"ContainerDied","Data":"631e30e0bb3fc865a19f88be0208370bf8b0494c88f884612bcc9b589ba5fcef"} Dec 09 15:15:01 crc kubenswrapper[4555]: I1209 15:15:01.238512 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421555-vlfbm" event={"ID":"43ec2a34-4530-4f67-ab99-937daf5420e2","Type":"ContainerStarted","Data":"12e895ed0c7dbd7b17d9ee583783f2c2da97b2efeac36121e6b5041075d54a0b"} Dec 09 15:15:02 crc kubenswrapper[4555]: I1209 15:15:02.492086 4555 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421555-vlfbm" Dec 09 15:15:02 crc kubenswrapper[4555]: I1209 15:15:02.555556 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-blw6r\" (UniqueName: \"kubernetes.io/projected/43ec2a34-4530-4f67-ab99-937daf5420e2-kube-api-access-blw6r\") pod \"43ec2a34-4530-4f67-ab99-937daf5420e2\" (UID: \"43ec2a34-4530-4f67-ab99-937daf5420e2\") " Dec 09 15:15:02 crc kubenswrapper[4555]: I1209 15:15:02.555659 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/43ec2a34-4530-4f67-ab99-937daf5420e2-secret-volume\") pod \"43ec2a34-4530-4f67-ab99-937daf5420e2\" (UID: \"43ec2a34-4530-4f67-ab99-937daf5420e2\") " Dec 09 15:15:02 crc kubenswrapper[4555]: I1209 15:15:02.555774 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/43ec2a34-4530-4f67-ab99-937daf5420e2-config-volume\") pod \"43ec2a34-4530-4f67-ab99-937daf5420e2\" (UID: \"43ec2a34-4530-4f67-ab99-937daf5420e2\") " Dec 09 15:15:02 crc kubenswrapper[4555]: I1209 15:15:02.558021 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43ec2a34-4530-4f67-ab99-937daf5420e2-config-volume" (OuterVolumeSpecName: "config-volume") pod "43ec2a34-4530-4f67-ab99-937daf5420e2" (UID: "43ec2a34-4530-4f67-ab99-937daf5420e2"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 15:15:02 crc kubenswrapper[4555]: I1209 15:15:02.560421 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43ec2a34-4530-4f67-ab99-937daf5420e2-kube-api-access-blw6r" (OuterVolumeSpecName: "kube-api-access-blw6r") pod "43ec2a34-4530-4f67-ab99-937daf5420e2" (UID: "43ec2a34-4530-4f67-ab99-937daf5420e2"). InnerVolumeSpecName "kube-api-access-blw6r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:15:02 crc kubenswrapper[4555]: I1209 15:15:02.560430 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43ec2a34-4530-4f67-ab99-937daf5420e2-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "43ec2a34-4530-4f67-ab99-937daf5420e2" (UID: "43ec2a34-4530-4f67-ab99-937daf5420e2"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 15:15:02 crc kubenswrapper[4555]: I1209 15:15:02.657262 4555 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/43ec2a34-4530-4f67-ab99-937daf5420e2-config-volume\") on node \"crc\" DevicePath \"\"" Dec 09 15:15:02 crc kubenswrapper[4555]: I1209 15:15:02.657318 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-blw6r\" (UniqueName: \"kubernetes.io/projected/43ec2a34-4530-4f67-ab99-937daf5420e2-kube-api-access-blw6r\") on node \"crc\" DevicePath \"\"" Dec 09 15:15:02 crc kubenswrapper[4555]: I1209 15:15:02.657329 4555 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/43ec2a34-4530-4f67-ab99-937daf5420e2-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 09 15:15:03 crc kubenswrapper[4555]: I1209 15:15:03.250859 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421555-vlfbm" event={"ID":"43ec2a34-4530-4f67-ab99-937daf5420e2","Type":"ContainerDied","Data":"12e895ed0c7dbd7b17d9ee583783f2c2da97b2efeac36121e6b5041075d54a0b"} Dec 09 15:15:03 crc kubenswrapper[4555]: I1209 15:15:03.250897 4555 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12e895ed0c7dbd7b17d9ee583783f2c2da97b2efeac36121e6b5041075d54a0b" Dec 09 15:15:03 crc kubenswrapper[4555]: I1209 15:15:03.251116 4555 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421555-vlfbm" Dec 09 15:15:19 crc kubenswrapper[4555]: I1209 15:15:19.929211 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-67986b9b4b-shcnp" Dec 09 15:15:20 crc kubenswrapper[4555]: I1209 15:15:20.444439 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-z7j4m"] Dec 09 15:15:20 crc kubenswrapper[4555]: E1209 15:15:20.444763 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43ec2a34-4530-4f67-ab99-937daf5420e2" containerName="collect-profiles" Dec 09 15:15:20 crc kubenswrapper[4555]: I1209 15:15:20.444778 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="43ec2a34-4530-4f67-ab99-937daf5420e2" containerName="collect-profiles" Dec 09 15:15:20 crc kubenswrapper[4555]: I1209 15:15:20.444931 4555 memory_manager.go:354] "RemoveStaleState removing state" podUID="43ec2a34-4530-4f67-ab99-937daf5420e2" containerName="collect-profiles" Dec 09 15:15:20 crc kubenswrapper[4555]: I1209 15:15:20.445494 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-z7j4m" Dec 09 15:15:20 crc kubenswrapper[4555]: I1209 15:15:20.448108 4555 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Dec 09 15:15:20 crc kubenswrapper[4555]: I1209 15:15:20.448320 4555 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-26znm" Dec 09 15:15:20 crc kubenswrapper[4555]: I1209 15:15:20.449023 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-kkbk6"] Dec 09 15:15:20 crc kubenswrapper[4555]: I1209 15:15:20.453736 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-kkbk6" Dec 09 15:15:20 crc kubenswrapper[4555]: I1209 15:15:20.459233 4555 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Dec 09 15:15:20 crc kubenswrapper[4555]: I1209 15:15:20.461855 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Dec 09 15:15:20 crc kubenswrapper[4555]: I1209 15:15:20.461865 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-z7j4m"] Dec 09 15:15:20 crc kubenswrapper[4555]: I1209 15:15:20.496641 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/9a99fb67-9306-46ea-ab6e-cc58491f5772-frr-sockets\") pod \"frr-k8s-kkbk6\" (UID: \"9a99fb67-9306-46ea-ab6e-cc58491f5772\") " pod="metallb-system/frr-k8s-kkbk6" Dec 09 15:15:20 crc kubenswrapper[4555]: I1209 15:15:20.496774 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9a99fb67-9306-46ea-ab6e-cc58491f5772-metrics-certs\") pod \"frr-k8s-kkbk6\" (UID: \"9a99fb67-9306-46ea-ab6e-cc58491f5772\") " pod="metallb-system/frr-k8s-kkbk6" Dec 09 15:15:20 crc kubenswrapper[4555]: I1209 15:15:20.496799 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kw57\" (UniqueName: \"kubernetes.io/projected/9a99fb67-9306-46ea-ab6e-cc58491f5772-kube-api-access-9kw57\") pod \"frr-k8s-kkbk6\" (UID: \"9a99fb67-9306-46ea-ab6e-cc58491f5772\") " pod="metallb-system/frr-k8s-kkbk6" Dec 09 15:15:20 crc kubenswrapper[4555]: I1209 15:15:20.496824 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/9a99fb67-9306-46ea-ab6e-cc58491f5772-frr-startup\") pod \"frr-k8s-kkbk6\" (UID: \"9a99fb67-9306-46ea-ab6e-cc58491f5772\") " pod="metallb-system/frr-k8s-kkbk6" Dec 09 15:15:20 crc kubenswrapper[4555]: I1209 15:15:20.496898 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/60ab11ad-6b93-44bb-855d-f913d995440a-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-z7j4m\" (UID: \"60ab11ad-6b93-44bb-855d-f913d995440a\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-z7j4m" Dec 09 15:15:20 crc kubenswrapper[4555]: I1209 15:15:20.496931 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/9a99fb67-9306-46ea-ab6e-cc58491f5772-frr-conf\") pod \"frr-k8s-kkbk6\" (UID: \"9a99fb67-9306-46ea-ab6e-cc58491f5772\") " pod="metallb-system/frr-k8s-kkbk6" Dec 09 15:15:20 crc kubenswrapper[4555]: I1209 15:15:20.497001 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/9a99fb67-9306-46ea-ab6e-cc58491f5772-metrics\") pod \"frr-k8s-kkbk6\" (UID: \"9a99fb67-9306-46ea-ab6e-cc58491f5772\") " pod="metallb-system/frr-k8s-kkbk6" Dec 09 15:15:20 crc kubenswrapper[4555]: I1209 15:15:20.497237 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jwjk\" (UniqueName: \"kubernetes.io/projected/60ab11ad-6b93-44bb-855d-f913d995440a-kube-api-access-5jwjk\") pod 
\"frr-k8s-webhook-server-7fcb986d4-z7j4m\" (UID: \"60ab11ad-6b93-44bb-855d-f913d995440a\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-z7j4m" Dec 09 15:15:20 crc kubenswrapper[4555]: I1209 15:15:20.497266 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/9a99fb67-9306-46ea-ab6e-cc58491f5772-reloader\") pod \"frr-k8s-kkbk6\" (UID: \"9a99fb67-9306-46ea-ab6e-cc58491f5772\") " pod="metallb-system/frr-k8s-kkbk6" Dec 09 15:15:20 crc kubenswrapper[4555]: I1209 15:15:20.530937 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-hn5bl"] Dec 09 15:15:20 crc kubenswrapper[4555]: I1209 15:15:20.532094 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-hn5bl" Dec 09 15:15:20 crc kubenswrapper[4555]: I1209 15:15:20.534477 4555 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Dec 09 15:15:20 crc kubenswrapper[4555]: I1209 15:15:20.534652 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Dec 09 15:15:20 crc kubenswrapper[4555]: I1209 15:15:20.534767 4555 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Dec 09 15:15:20 crc kubenswrapper[4555]: I1209 15:15:20.534806 4555 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-wkbc2" Dec 09 15:15:20 crc kubenswrapper[4555]: I1209 15:15:20.544524 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-f8648f98b-x79zw"] Dec 09 15:15:20 crc kubenswrapper[4555]: I1209 15:15:20.545637 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-x79zw" Dec 09 15:15:20 crc kubenswrapper[4555]: I1209 15:15:20.547818 4555 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Dec 09 15:15:20 crc kubenswrapper[4555]: I1209 15:15:20.549895 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-x79zw"] Dec 09 15:15:20 crc kubenswrapper[4555]: I1209 15:15:20.599029 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/9a99fb67-9306-46ea-ab6e-cc58491f5772-frr-startup\") pod \"frr-k8s-kkbk6\" (UID: \"9a99fb67-9306-46ea-ab6e-cc58491f5772\") " pod="metallb-system/frr-k8s-kkbk6" Dec 09 15:15:20 crc kubenswrapper[4555]: I1209 15:15:20.599098 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/60ab11ad-6b93-44bb-855d-f913d995440a-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-z7j4m\" (UID: \"60ab11ad-6b93-44bb-855d-f913d995440a\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-z7j4m" Dec 09 15:15:20 crc kubenswrapper[4555]: I1209 15:15:20.599130 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/9a99fb67-9306-46ea-ab6e-cc58491f5772-frr-conf\") pod \"frr-k8s-kkbk6\" (UID: \"9a99fb67-9306-46ea-ab6e-cc58491f5772\") " pod="metallb-system/frr-k8s-kkbk6" Dec 09 15:15:20 crc kubenswrapper[4555]: I1209 15:15:20.599150 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jc65x\" (UniqueName: \"kubernetes.io/projected/3c14ab57-d872-4657-b6bc-30ab1752110a-kube-api-access-jc65x\") pod \"controller-f8648f98b-x79zw\" (UID: \"3c14ab57-d872-4657-b6bc-30ab1752110a\") " pod="metallb-system/controller-f8648f98b-x79zw" Dec 09 15:15:20 crc kubenswrapper[4555]: I1209 15:15:20.599176 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/9a99fb67-9306-46ea-ab6e-cc58491f5772-metrics\") pod \"frr-k8s-kkbk6\" (UID: \"9a99fb67-9306-46ea-ab6e-cc58491f5772\") " pod="metallb-system/frr-k8s-kkbk6" Dec 09 15:15:20 crc kubenswrapper[4555]: I1209 15:15:20.599192 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/ed284bf2-6ecf-4453-8a50-b951e38f8129-metallb-excludel2\") pod \"speaker-hn5bl\" (UID: \"ed284bf2-6ecf-4453-8a50-b951e38f8129\") " pod="metallb-system/speaker-hn5bl" Dec 09 15:15:20 crc kubenswrapper[4555]: I1209 15:15:20.599218 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3c14ab57-d872-4657-b6bc-30ab1752110a-cert\") pod \"controller-f8648f98b-x79zw\" (UID: \"3c14ab57-d872-4657-b6bc-30ab1752110a\") " pod="metallb-system/controller-f8648f98b-x79zw" Dec 09 15:15:20 crc kubenswrapper[4555]: I1209 15:15:20.599261 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jwjk\" (UniqueName: \"kubernetes.io/projected/60ab11ad-6b93-44bb-855d-f913d995440a-kube-api-access-5jwjk\") pod \"frr-k8s-webhook-server-7fcb986d4-z7j4m\" (UID: \"60ab11ad-6b93-44bb-855d-f913d995440a\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-z7j4m" Dec 09 15:15:20 crc kubenswrapper[4555]: I1209 15:15:20.599275 
4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/9a99fb67-9306-46ea-ab6e-cc58491f5772-reloader\") pod \"frr-k8s-kkbk6\" (UID: \"9a99fb67-9306-46ea-ab6e-cc58491f5772\") " pod="metallb-system/frr-k8s-kkbk6" Dec 09 15:15:20 crc kubenswrapper[4555]: I1209 15:15:20.599337 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcr2c\" (UniqueName: \"kubernetes.io/projected/ed284bf2-6ecf-4453-8a50-b951e38f8129-kube-api-access-rcr2c\") pod \"speaker-hn5bl\" (UID: \"ed284bf2-6ecf-4453-8a50-b951e38f8129\") " pod="metallb-system/speaker-hn5bl" Dec 09 15:15:20 crc kubenswrapper[4555]: I1209 15:15:20.599356 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ed284bf2-6ecf-4453-8a50-b951e38f8129-metrics-certs\") pod \"speaker-hn5bl\" (UID: \"ed284bf2-6ecf-4453-8a50-b951e38f8129\") " pod="metallb-system/speaker-hn5bl" Dec 09 15:15:20 crc kubenswrapper[4555]: I1209 15:15:20.599421 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3c14ab57-d872-4657-b6bc-30ab1752110a-metrics-certs\") pod \"controller-f8648f98b-x79zw\" (UID: \"3c14ab57-d872-4657-b6bc-30ab1752110a\") " pod="metallb-system/controller-f8648f98b-x79zw" Dec 09 15:15:20 crc kubenswrapper[4555]: I1209 15:15:20.599439 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/9a99fb67-9306-46ea-ab6e-cc58491f5772-frr-sockets\") pod \"frr-k8s-kkbk6\" (UID: \"9a99fb67-9306-46ea-ab6e-cc58491f5772\") " pod="metallb-system/frr-k8s-kkbk6" Dec 09 15:15:20 crc kubenswrapper[4555]: I1209 15:15:20.599478 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ed284bf2-6ecf-4453-8a50-b951e38f8129-memberlist\") pod \"speaker-hn5bl\" (UID: \"ed284bf2-6ecf-4453-8a50-b951e38f8129\") " pod="metallb-system/speaker-hn5bl" Dec 09 15:15:20 crc kubenswrapper[4555]: I1209 15:15:20.599502 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9a99fb67-9306-46ea-ab6e-cc58491f5772-metrics-certs\") pod \"frr-k8s-kkbk6\" (UID: \"9a99fb67-9306-46ea-ab6e-cc58491f5772\") " pod="metallb-system/frr-k8s-kkbk6" Dec 09 15:15:20 crc kubenswrapper[4555]: I1209 15:15:20.599518 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kw57\" (UniqueName: \"kubernetes.io/projected/9a99fb67-9306-46ea-ab6e-cc58491f5772-kube-api-access-9kw57\") pod \"frr-k8s-kkbk6\" (UID: \"9a99fb67-9306-46ea-ab6e-cc58491f5772\") " pod="metallb-system/frr-k8s-kkbk6" Dec 09 15:15:20 crc kubenswrapper[4555]: I1209 15:15:20.599951 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/9a99fb67-9306-46ea-ab6e-cc58491f5772-frr-startup\") pod \"frr-k8s-kkbk6\" (UID: \"9a99fb67-9306-46ea-ab6e-cc58491f5772\") " pod="metallb-system/frr-k8s-kkbk6" Dec 09 15:15:20 crc kubenswrapper[4555]: I1209 15:15:20.600191 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/9a99fb67-9306-46ea-ab6e-cc58491f5772-reloader\") pod \"frr-k8s-kkbk6\" (UID: 
\"9a99fb67-9306-46ea-ab6e-cc58491f5772\") " pod="metallb-system/frr-k8s-kkbk6" Dec 09 15:15:20 crc kubenswrapper[4555]: I1209 15:15:20.600385 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/9a99fb67-9306-46ea-ab6e-cc58491f5772-frr-conf\") pod \"frr-k8s-kkbk6\" (UID: \"9a99fb67-9306-46ea-ab6e-cc58491f5772\") " pod="metallb-system/frr-k8s-kkbk6" Dec 09 15:15:20 crc kubenswrapper[4555]: I1209 15:15:20.600566 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/9a99fb67-9306-46ea-ab6e-cc58491f5772-metrics\") pod \"frr-k8s-kkbk6\" (UID: \"9a99fb67-9306-46ea-ab6e-cc58491f5772\") " pod="metallb-system/frr-k8s-kkbk6" Dec 09 15:15:20 crc kubenswrapper[4555]: I1209 15:15:20.600732 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/9a99fb67-9306-46ea-ab6e-cc58491f5772-frr-sockets\") pod \"frr-k8s-kkbk6\" (UID: \"9a99fb67-9306-46ea-ab6e-cc58491f5772\") " pod="metallb-system/frr-k8s-kkbk6" Dec 09 15:15:20 crc kubenswrapper[4555]: E1209 15:15:20.600798 4555 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Dec 09 15:15:20 crc kubenswrapper[4555]: E1209 15:15:20.600839 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a99fb67-9306-46ea-ab6e-cc58491f5772-metrics-certs podName:9a99fb67-9306-46ea-ab6e-cc58491f5772 nodeName:}" failed. No retries permitted until 2025-12-09 15:15:21.100826798 +0000 UTC m=+1031.237539517 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9a99fb67-9306-46ea-ab6e-cc58491f5772-metrics-certs") pod "frr-k8s-kkbk6" (UID: "9a99fb67-9306-46ea-ab6e-cc58491f5772") : secret "frr-k8s-certs-secret" not found Dec 09 15:15:20 crc kubenswrapper[4555]: I1209 15:15:20.607731 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/60ab11ad-6b93-44bb-855d-f913d995440a-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-z7j4m\" (UID: \"60ab11ad-6b93-44bb-855d-f913d995440a\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-z7j4m" Dec 09 15:15:20 crc kubenswrapper[4555]: I1209 15:15:20.614135 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jwjk\" (UniqueName: \"kubernetes.io/projected/60ab11ad-6b93-44bb-855d-f913d995440a-kube-api-access-5jwjk\") pod \"frr-k8s-webhook-server-7fcb986d4-z7j4m\" (UID: \"60ab11ad-6b93-44bb-855d-f913d995440a\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-z7j4m" Dec 09 15:15:20 crc kubenswrapper[4555]: I1209 15:15:20.617067 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kw57\" (UniqueName: \"kubernetes.io/projected/9a99fb67-9306-46ea-ab6e-cc58491f5772-kube-api-access-9kw57\") pod \"frr-k8s-kkbk6\" (UID: \"9a99fb67-9306-46ea-ab6e-cc58491f5772\") " pod="metallb-system/frr-k8s-kkbk6" Dec 09 15:15:20 crc kubenswrapper[4555]: I1209 15:15:20.701020 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3c14ab57-d872-4657-b6bc-30ab1752110a-metrics-certs\") pod \"controller-f8648f98b-x79zw\" (UID: \"3c14ab57-d872-4657-b6bc-30ab1752110a\") " pod="metallb-system/controller-f8648f98b-x79zw" Dec 09 15:15:20 crc kubenswrapper[4555]: I1209 15:15:20.701098 4555 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ed284bf2-6ecf-4453-8a50-b951e38f8129-memberlist\") pod \"speaker-hn5bl\" (UID: \"ed284bf2-6ecf-4453-8a50-b951e38f8129\") " pod="metallb-system/speaker-hn5bl" Dec 09 15:15:20 crc kubenswrapper[4555]: I1209 15:15:20.701193 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jc65x\" (UniqueName: \"kubernetes.io/projected/3c14ab57-d872-4657-b6bc-30ab1752110a-kube-api-access-jc65x\") pod \"controller-f8648f98b-x79zw\" (UID: \"3c14ab57-d872-4657-b6bc-30ab1752110a\") " pod="metallb-system/controller-f8648f98b-x79zw" Dec 09 15:15:20 crc kubenswrapper[4555]: I1209 15:15:20.701220 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/ed284bf2-6ecf-4453-8a50-b951e38f8129-metallb-excludel2\") pod \"speaker-hn5bl\" (UID: \"ed284bf2-6ecf-4453-8a50-b951e38f8129\") " pod="metallb-system/speaker-hn5bl" Dec 09 15:15:20 crc kubenswrapper[4555]: I1209 15:15:20.701247 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3c14ab57-d872-4657-b6bc-30ab1752110a-cert\") pod \"controller-f8648f98b-x79zw\" (UID: \"3c14ab57-d872-4657-b6bc-30ab1752110a\") " pod="metallb-system/controller-f8648f98b-x79zw" Dec 09 15:15:20 crc kubenswrapper[4555]: I1209 15:15:20.701318 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcr2c\" (UniqueName: \"kubernetes.io/projected/ed284bf2-6ecf-4453-8a50-b951e38f8129-kube-api-access-rcr2c\") pod \"speaker-hn5bl\" (UID: \"ed284bf2-6ecf-4453-8a50-b951e38f8129\") " pod="metallb-system/speaker-hn5bl" Dec 09 15:15:20 crc kubenswrapper[4555]: I1209 15:15:20.701333 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ed284bf2-6ecf-4453-8a50-b951e38f8129-metrics-certs\") pod \"speaker-hn5bl\" (UID: \"ed284bf2-6ecf-4453-8a50-b951e38f8129\") " pod="metallb-system/speaker-hn5bl" Dec 09 15:15:20 crc kubenswrapper[4555]: E1209 15:15:20.701610 4555 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Dec 09 15:15:20 crc kubenswrapper[4555]: E1209 15:15:20.701685 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c14ab57-d872-4657-b6bc-30ab1752110a-metrics-certs podName:3c14ab57-d872-4657-b6bc-30ab1752110a nodeName:}" failed. No retries permitted until 2025-12-09 15:15:21.201665826 +0000 UTC m=+1031.338378544 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3c14ab57-d872-4657-b6bc-30ab1752110a-metrics-certs") pod "controller-f8648f98b-x79zw" (UID: "3c14ab57-d872-4657-b6bc-30ab1752110a") : secret "controller-certs-secret" not found Dec 09 15:15:20 crc kubenswrapper[4555]: E1209 15:15:20.701832 4555 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 09 15:15:20 crc kubenswrapper[4555]: E1209 15:15:20.701938 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed284bf2-6ecf-4453-8a50-b951e38f8129-memberlist podName:ed284bf2-6ecf-4453-8a50-b951e38f8129 nodeName:}" failed. No retries permitted until 2025-12-09 15:15:21.201924304 +0000 UTC m=+1031.338637022 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/ed284bf2-6ecf-4453-8a50-b951e38f8129-memberlist") pod "speaker-hn5bl" (UID: "ed284bf2-6ecf-4453-8a50-b951e38f8129") : secret "metallb-memberlist" not found Dec 09 15:15:20 crc kubenswrapper[4555]: I1209 15:15:20.702409 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/ed284bf2-6ecf-4453-8a50-b951e38f8129-metallb-excludel2\") pod \"speaker-hn5bl\" (UID: \"ed284bf2-6ecf-4453-8a50-b951e38f8129\") " pod="metallb-system/speaker-hn5bl" Dec 09 15:15:20 crc kubenswrapper[4555]: I1209 15:15:20.705538 4555 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 09 15:15:20 crc kubenswrapper[4555]: I1209 15:15:20.705700 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ed284bf2-6ecf-4453-8a50-b951e38f8129-metrics-certs\") pod \"speaker-hn5bl\" (UID: \"ed284bf2-6ecf-4453-8a50-b951e38f8129\") " pod="metallb-system/speaker-hn5bl" Dec 09 15:15:20 crc kubenswrapper[4555]: I1209 15:15:20.716665 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3c14ab57-d872-4657-b6bc-30ab1752110a-cert\") pod \"controller-f8648f98b-x79zw\" (UID: \"3c14ab57-d872-4657-b6bc-30ab1752110a\") " pod="metallb-system/controller-f8648f98b-x79zw" Dec 09 15:15:20 crc kubenswrapper[4555]: I1209 15:15:20.720044 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jc65x\" (UniqueName: \"kubernetes.io/projected/3c14ab57-d872-4657-b6bc-30ab1752110a-kube-api-access-jc65x\") pod \"controller-f8648f98b-x79zw\" (UID: \"3c14ab57-d872-4657-b6bc-30ab1752110a\") " pod="metallb-system/controller-f8648f98b-x79zw" Dec 09 15:15:20 crc kubenswrapper[4555]: I1209 15:15:20.723678 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcr2c\" (UniqueName: \"kubernetes.io/projected/ed284bf2-6ecf-4453-8a50-b951e38f8129-kube-api-access-rcr2c\") pod \"speaker-hn5bl\" (UID: \"ed284bf2-6ecf-4453-8a50-b951e38f8129\") " pod="metallb-system/speaker-hn5bl" Dec 09 15:15:20 crc kubenswrapper[4555]: I1209 15:15:20.765329 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-z7j4m" Dec 09 15:15:21 crc kubenswrapper[4555]: I1209 15:15:21.111172 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9a99fb67-9306-46ea-ab6e-cc58491f5772-metrics-certs\") pod \"frr-k8s-kkbk6\" (UID: \"9a99fb67-9306-46ea-ab6e-cc58491f5772\") " pod="metallb-system/frr-k8s-kkbk6" Dec 09 15:15:21 crc kubenswrapper[4555]: I1209 15:15:21.114479 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9a99fb67-9306-46ea-ab6e-cc58491f5772-metrics-certs\") pod \"frr-k8s-kkbk6\" (UID: \"9a99fb67-9306-46ea-ab6e-cc58491f5772\") " pod="metallb-system/frr-k8s-kkbk6" Dec 09 15:15:21 crc kubenswrapper[4555]: I1209 15:15:21.136237 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-z7j4m"] Dec 09 15:15:21 crc kubenswrapper[4555]: I1209 15:15:21.212634 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3c14ab57-d872-4657-b6bc-30ab1752110a-metrics-certs\") pod \"controller-f8648f98b-x79zw\" (UID: \"3c14ab57-d872-4657-b6bc-30ab1752110a\") " pod="metallb-system/controller-f8648f98b-x79zw" Dec 09 15:15:21 crc kubenswrapper[4555]: I1209 15:15:21.212700 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ed284bf2-6ecf-4453-8a50-b951e38f8129-memberlist\") pod \"speaker-hn5bl\" (UID: \"ed284bf2-6ecf-4453-8a50-b951e38f8129\") " pod="metallb-system/speaker-hn5bl" Dec 09 15:15:21 crc kubenswrapper[4555]: E1209 15:15:21.212996 4555 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 09 15:15:21 crc kubenswrapper[4555]: E1209 15:15:21.213176 4555 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed284bf2-6ecf-4453-8a50-b951e38f8129-memberlist podName:ed284bf2-6ecf-4453-8a50-b951e38f8129 nodeName:}" failed. No retries permitted until 2025-12-09 15:15:22.213154291 +0000 UTC m=+1032.349867009 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/ed284bf2-6ecf-4453-8a50-b951e38f8129-memberlist") pod "speaker-hn5bl" (UID: "ed284bf2-6ecf-4453-8a50-b951e38f8129") : secret "metallb-memberlist" not found Dec 09 15:15:21 crc kubenswrapper[4555]: I1209 15:15:21.215944 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3c14ab57-d872-4657-b6bc-30ab1752110a-metrics-certs\") pod \"controller-f8648f98b-x79zw\" (UID: \"3c14ab57-d872-4657-b6bc-30ab1752110a\") " pod="metallb-system/controller-f8648f98b-x79zw" Dec 09 15:15:21 crc kubenswrapper[4555]: I1209 15:15:21.341037 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-z7j4m" event={"ID":"60ab11ad-6b93-44bb-855d-f913d995440a","Type":"ContainerStarted","Data":"ec4929636fdf27c8aeea53677cde8007001d8f6f60cc8203d097a1802bf97827"} Dec 09 15:15:21 crc kubenswrapper[4555]: I1209 15:15:21.372807 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-kkbk6" Dec 09 15:15:21 crc kubenswrapper[4555]: I1209 15:15:21.465817 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-x79zw" Dec 09 15:15:21 crc kubenswrapper[4555]: I1209 15:15:21.808996 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-x79zw"] Dec 09 15:15:21 crc kubenswrapper[4555]: W1209 15:15:21.811622 4555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c14ab57_d872_4657_b6bc_30ab1752110a.slice/crio-e660c5822880481d0869d9addf025cb7e5d10fb68c3f12ecd648d6d0b16fba8b WatchSource:0}: Error finding container e660c5822880481d0869d9addf025cb7e5d10fb68c3f12ecd648d6d0b16fba8b: Status 404 returned error can't find the container with id e660c5822880481d0869d9addf025cb7e5d10fb68c3f12ecd648d6d0b16fba8b Dec 09 15:15:22 crc kubenswrapper[4555]: I1209 15:15:22.227051 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ed284bf2-6ecf-4453-8a50-b951e38f8129-memberlist\") pod \"speaker-hn5bl\" (UID: \"ed284bf2-6ecf-4453-8a50-b951e38f8129\") " pod="metallb-system/speaker-hn5bl" Dec 09 15:15:22 crc kubenswrapper[4555]: I1209 15:15:22.233370 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ed284bf2-6ecf-4453-8a50-b951e38f8129-memberlist\") pod \"speaker-hn5bl\" (UID: \"ed284bf2-6ecf-4453-8a50-b951e38f8129\") " pod="metallb-system/speaker-hn5bl" Dec 09 15:15:22 crc kubenswrapper[4555]: I1209 15:15:22.347018 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kkbk6" event={"ID":"9a99fb67-9306-46ea-ab6e-cc58491f5772","Type":"ContainerStarted","Data":"8318f7a46aa0e6cf2d81332397cbf4caf7ea76d46787d309b32c616b4ea33dfe"} Dec 09 15:15:22 crc kubenswrapper[4555]: I1209 15:15:22.348429 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-hn5bl" Dec 09 15:15:22 crc kubenswrapper[4555]: I1209 15:15:22.349240 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-x79zw" event={"ID":"3c14ab57-d872-4657-b6bc-30ab1752110a","Type":"ContainerStarted","Data":"04639f661694ea34f17d94328f7875d4f6a20e59abb1759b0334a898739e2e5c"} Dec 09 15:15:22 crc kubenswrapper[4555]: I1209 15:15:22.349276 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-x79zw" event={"ID":"3c14ab57-d872-4657-b6bc-30ab1752110a","Type":"ContainerStarted","Data":"520076520fd4c1c07cc95f5cf2f28a52e0768315613bc65f21f5daec979238de"} Dec 09 15:15:22 crc kubenswrapper[4555]: I1209 15:15:22.349286 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-x79zw" event={"ID":"3c14ab57-d872-4657-b6bc-30ab1752110a","Type":"ContainerStarted","Data":"e660c5822880481d0869d9addf025cb7e5d10fb68c3f12ecd648d6d0b16fba8b"} Dec 09 15:15:22 crc kubenswrapper[4555]: I1209 15:15:22.364349 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-f8648f98b-x79zw" podStartSLOduration=2.3643389089999998 podStartE2EDuration="2.364338909s" podCreationTimestamp="2025-12-09 15:15:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:15:22.359417477 +0000 UTC m=+1032.496130195" watchObservedRunningTime="2025-12-09 15:15:22.364338909 +0000 UTC m=+1032.501051627" Dec 09 15:15:22 crc kubenswrapper[4555]: W1209 15:15:22.366853 4555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded284bf2_6ecf_4453_8a50_b951e38f8129.slice/crio-fc4c5201800302122a977105c749ff321fb7686d8eceb2d3b9eb7ad5f30b0eac WatchSource:0}: Error finding container fc4c5201800302122a977105c749ff321fb7686d8eceb2d3b9eb7ad5f30b0eac: Status 404 returned error can't find the container with id fc4c5201800302122a977105c749ff321fb7686d8eceb2d3b9eb7ad5f30b0eac Dec 09 15:15:23 crc kubenswrapper[4555]: I1209 15:15:23.379946 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-hn5bl" event={"ID":"ed284bf2-6ecf-4453-8a50-b951e38f8129","Type":"ContainerStarted","Data":"4ba761e7791ac0ccda3fab116e31d36228ce1e7f9da0f329d46b69e3f6c72686"} Dec 09 15:15:23 crc kubenswrapper[4555]: I1209 15:15:23.380010 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-f8648f98b-x79zw" Dec 09 15:15:23 crc kubenswrapper[4555]: I1209 15:15:23.380022 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-hn5bl" event={"ID":"ed284bf2-6ecf-4453-8a50-b951e38f8129","Type":"ContainerStarted","Data":"8851cd36d560a0a2401e6468e293e5e9d3e9ddd172efd488f18369f4e6db6fb5"} Dec 09 15:15:23 crc kubenswrapper[4555]: I1209 15:15:23.380031 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-hn5bl" event={"ID":"ed284bf2-6ecf-4453-8a50-b951e38f8129","Type":"ContainerStarted","Data":"fc4c5201800302122a977105c749ff321fb7686d8eceb2d3b9eb7ad5f30b0eac"} Dec 09 15:15:23 crc kubenswrapper[4555]: I1209 15:15:23.380843 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-hn5bl" Dec 09 15:15:23 crc kubenswrapper[4555]: I1209 15:15:23.410884 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="metallb-system/speaker-hn5bl" podStartSLOduration=3.410868731 podStartE2EDuration="3.410868731s" podCreationTimestamp="2025-12-09 15:15:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:15:23.406658391 +0000 UTC m=+1033.543371109" watchObservedRunningTime="2025-12-09 15:15:23.410868731 +0000 UTC m=+1033.547581450" Dec 09 15:15:27 crc kubenswrapper[4555]: I1209 15:15:27.403862 4555 generic.go:334] "Generic (PLEG): container finished" podID="9a99fb67-9306-46ea-ab6e-cc58491f5772" containerID="dc641005f48eb136319a2f5df2459e5b0d7af4d7edcd4c9e3ffbfd19919541d6" exitCode=0 Dec 09 15:15:27 crc kubenswrapper[4555]: I1209 15:15:27.403902 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kkbk6" event={"ID":"9a99fb67-9306-46ea-ab6e-cc58491f5772","Type":"ContainerDied","Data":"dc641005f48eb136319a2f5df2459e5b0d7af4d7edcd4c9e3ffbfd19919541d6"} Dec 09 15:15:27 crc kubenswrapper[4555]: I1209 15:15:27.405180 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-z7j4m" event={"ID":"60ab11ad-6b93-44bb-855d-f913d995440a","Type":"ContainerStarted","Data":"b76f2fe4fbfdf4297b3f9dba39b8dafccc99406127b8a9de462d35763a9d09b9"} Dec 09 15:15:27 crc kubenswrapper[4555]: I1209 15:15:27.405337 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-z7j4m" Dec 09 15:15:27 crc kubenswrapper[4555]: I1209 15:15:27.434916 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-z7j4m" podStartSLOduration=1.478545752 podStartE2EDuration="7.434902426s" podCreationTimestamp="2025-12-09 15:15:20 +0000 UTC" firstStartedPulling="2025-12-09 15:15:21.14039831 +0000 UTC m=+1031.277111028" lastFinishedPulling="2025-12-09 15:15:27.096754983 +0000 UTC m=+1037.233467702" observedRunningTime="2025-12-09 15:15:27.430986769 +0000 UTC m=+1037.567699487" watchObservedRunningTime="2025-12-09 15:15:27.434902426 +0000 UTC m=+1037.571615143" Dec 09 15:15:28 crc kubenswrapper[4555]: I1209 15:15:28.412092 4555 generic.go:334] "Generic (PLEG): container finished" podID="9a99fb67-9306-46ea-ab6e-cc58491f5772" containerID="a22da10a0d5f367e64da1a3f22cb20d59f9eb303880930c6bb6c249ba4bb0c22" exitCode=0 Dec 09 15:15:28 crc kubenswrapper[4555]: I1209 15:15:28.412162 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kkbk6" event={"ID":"9a99fb67-9306-46ea-ab6e-cc58491f5772","Type":"ContainerDied","Data":"a22da10a0d5f367e64da1a3f22cb20d59f9eb303880930c6bb6c249ba4bb0c22"} Dec 09 15:15:29 crc kubenswrapper[4555]: I1209 15:15:29.419440 4555 generic.go:334] "Generic (PLEG): container finished" podID="9a99fb67-9306-46ea-ab6e-cc58491f5772" containerID="d79c6d902e235bc3f6811d4745de0d9c7175f6c41f30c4a840fb84f7e6e27b40" exitCode=0 Dec 09 15:15:29 crc kubenswrapper[4555]: I1209 15:15:29.419509 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kkbk6" event={"ID":"9a99fb67-9306-46ea-ab6e-cc58491f5772","Type":"ContainerDied","Data":"d79c6d902e235bc3f6811d4745de0d9c7175f6c41f30c4a840fb84f7e6e27b40"} Dec 09 15:15:30 crc kubenswrapper[4555]: I1209 15:15:30.427703 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kkbk6" 
event={"ID":"9a99fb67-9306-46ea-ab6e-cc58491f5772","Type":"ContainerStarted","Data":"f76d04a245e02b7f9e50a5f14532d2a8926825eb7bff5b31fe25d1d8f5e19ed8"} Dec 09 15:15:30 crc kubenswrapper[4555]: I1209 15:15:30.428261 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kkbk6" event={"ID":"9a99fb67-9306-46ea-ab6e-cc58491f5772","Type":"ContainerStarted","Data":"8e45fe518d05613cd130c7bed2b415128329de0d95754e2e9038ab5aa06a7fd0"} Dec 09 15:15:30 crc kubenswrapper[4555]: I1209 15:15:30.428274 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kkbk6" event={"ID":"9a99fb67-9306-46ea-ab6e-cc58491f5772","Type":"ContainerStarted","Data":"2d9ba923a318c4846acfd114cb9af1ec29a3890155e80549eb6faeb67bcd5fea"} Dec 09 15:15:30 crc kubenswrapper[4555]: I1209 15:15:30.428284 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kkbk6" event={"ID":"9a99fb67-9306-46ea-ab6e-cc58491f5772","Type":"ContainerStarted","Data":"8bcf3d8ea2acce382edf3fd361b56a7b93da0e8831d0d0af2ce7d850eaa0b219"} Dec 09 15:15:30 crc kubenswrapper[4555]: I1209 15:15:30.428293 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kkbk6" event={"ID":"9a99fb67-9306-46ea-ab6e-cc58491f5772","Type":"ContainerStarted","Data":"4450f0210e191f09ceaec8663fffcd142e271baa87d9de30a82efa5212d4f9b2"} Dec 09 15:15:30 crc kubenswrapper[4555]: I1209 15:15:30.428323 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-kkbk6" Dec 09 15:15:30 crc kubenswrapper[4555]: I1209 15:15:30.428333 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kkbk6" event={"ID":"9a99fb67-9306-46ea-ab6e-cc58491f5772","Type":"ContainerStarted","Data":"99893f210d52a718b3da8a430314638f00ba6a8f68e51915f10869586c95b9f2"} Dec 09 15:15:30 crc kubenswrapper[4555]: I1209 15:15:30.447359 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-kkbk6" podStartSLOduration=4.806780413 podStartE2EDuration="10.447345167s" podCreationTimestamp="2025-12-09 15:15:20 +0000 UTC" firstStartedPulling="2025-12-09 15:15:21.463122134 +0000 UTC m=+1031.599834853" lastFinishedPulling="2025-12-09 15:15:27.103686888 +0000 UTC m=+1037.240399607" observedRunningTime="2025-12-09 15:15:30.441824715 +0000 UTC m=+1040.578537433" watchObservedRunningTime="2025-12-09 15:15:30.447345167 +0000 UTC m=+1040.584057885" Dec 09 15:15:31 crc kubenswrapper[4555]: I1209 15:15:31.373657 4555 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-kkbk6" Dec 09 15:15:31 crc kubenswrapper[4555]: I1209 15:15:31.401406 4555 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-kkbk6" Dec 09 15:15:31 crc kubenswrapper[4555]: I1209 15:15:31.469931 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-f8648f98b-x79zw" Dec 09 15:15:32 crc kubenswrapper[4555]: I1209 15:15:32.351183 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-hn5bl" Dec 09 15:15:34 crc kubenswrapper[4555]: I1209 15:15:34.521049 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-z2n5j"] Dec 09 15:15:34 crc kubenswrapper[4555]: I1209 15:15:34.522653 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-z2n5j" Dec 09 15:15:34 crc kubenswrapper[4555]: I1209 15:15:34.524318 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-6sv58" Dec 09 15:15:34 crc kubenswrapper[4555]: I1209 15:15:34.524554 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Dec 09 15:15:34 crc kubenswrapper[4555]: I1209 15:15:34.524690 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Dec 09 15:15:34 crc kubenswrapper[4555]: I1209 15:15:34.539785 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-z2n5j"] Dec 09 15:15:34 crc kubenswrapper[4555]: I1209 15:15:34.598211 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54j7p\" (UniqueName: \"kubernetes.io/projected/70577dfe-35b4-46d4-bccf-59d719879da2-kube-api-access-54j7p\") pod \"openstack-operator-index-z2n5j\" (UID: \"70577dfe-35b4-46d4-bccf-59d719879da2\") " pod="openstack-operators/openstack-operator-index-z2n5j" Dec 09 15:15:34 crc kubenswrapper[4555]: I1209 15:15:34.699786 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54j7p\" (UniqueName: \"kubernetes.io/projected/70577dfe-35b4-46d4-bccf-59d719879da2-kube-api-access-54j7p\") pod \"openstack-operator-index-z2n5j\" (UID: \"70577dfe-35b4-46d4-bccf-59d719879da2\") " pod="openstack-operators/openstack-operator-index-z2n5j" Dec 09 15:15:34 crc kubenswrapper[4555]: I1209 15:15:34.713725 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54j7p\" (UniqueName: \"kubernetes.io/projected/70577dfe-35b4-46d4-bccf-59d719879da2-kube-api-access-54j7p\") pod \"openstack-operator-index-z2n5j\" (UID: \"70577dfe-35b4-46d4-bccf-59d719879da2\") " pod="openstack-operators/openstack-operator-index-z2n5j" Dec 09 15:15:34 crc kubenswrapper[4555]: I1209 15:15:34.839699 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-z2n5j" Dec 09 15:15:35 crc kubenswrapper[4555]: I1209 15:15:35.186530 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-z2n5j"] Dec 09 15:15:35 crc kubenswrapper[4555]: W1209 15:15:35.189168 4555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70577dfe_35b4_46d4_bccf_59d719879da2.slice/crio-16426a3220c0a4000155862b386e50fbdb7af982825d1ed84f5095b01ec1bb33 WatchSource:0}: Error finding container 16426a3220c0a4000155862b386e50fbdb7af982825d1ed84f5095b01ec1bb33: Status 404 returned error can't find the container with id 16426a3220c0a4000155862b386e50fbdb7af982825d1ed84f5095b01ec1bb33 Dec 09 15:15:35 crc kubenswrapper[4555]: I1209 15:15:35.454216 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-z2n5j" event={"ID":"70577dfe-35b4-46d4-bccf-59d719879da2","Type":"ContainerStarted","Data":"16426a3220c0a4000155862b386e50fbdb7af982825d1ed84f5095b01ec1bb33"} Dec 09 15:15:37 crc kubenswrapper[4555]: I1209 15:15:37.707352 4555 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-z2n5j"] Dec 09 15:15:38 crc kubenswrapper[4555]: I1209 15:15:38.311831 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-dqv6m"] Dec 09 15:15:38 crc kubenswrapper[4555]: I1209 15:15:38.312842 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-dqv6m" Dec 09 15:15:38 crc kubenswrapper[4555]: I1209 15:15:38.317131 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-dqv6m"] Dec 09 15:15:38 crc kubenswrapper[4555]: I1209 15:15:38.351035 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2gln\" (UniqueName: \"kubernetes.io/projected/d70d9237-a23b-4955-9f6d-1dee064186ff-kube-api-access-t2gln\") pod \"openstack-operator-index-dqv6m\" (UID: \"d70d9237-a23b-4955-9f6d-1dee064186ff\") " pod="openstack-operators/openstack-operator-index-dqv6m" Dec 09 15:15:38 crc kubenswrapper[4555]: I1209 15:15:38.452549 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2gln\" (UniqueName: \"kubernetes.io/projected/d70d9237-a23b-4955-9f6d-1dee064186ff-kube-api-access-t2gln\") pod \"openstack-operator-index-dqv6m\" (UID: \"d70d9237-a23b-4955-9f6d-1dee064186ff\") " pod="openstack-operators/openstack-operator-index-dqv6m" Dec 09 15:15:38 crc kubenswrapper[4555]: I1209 15:15:38.474782 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2gln\" (UniqueName: \"kubernetes.io/projected/d70d9237-a23b-4955-9f6d-1dee064186ff-kube-api-access-t2gln\") pod \"openstack-operator-index-dqv6m\" (UID: \"d70d9237-a23b-4955-9f6d-1dee064186ff\") " pod="openstack-operators/openstack-operator-index-dqv6m" Dec 09 15:15:38 crc kubenswrapper[4555]: I1209 15:15:38.626106 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-dqv6m" Dec 09 15:15:38 crc kubenswrapper[4555]: I1209 15:15:38.966753 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-dqv6m"] Dec 09 15:15:38 crc kubenswrapper[4555]: W1209 15:15:38.971340 4555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd70d9237_a23b_4955_9f6d_1dee064186ff.slice/crio-722205b382fa29d72bda6d956dc4f37da436be416acc4e335a21024c00852065 WatchSource:0}: Error finding container 722205b382fa29d72bda6d956dc4f37da436be416acc4e335a21024c00852065: Status 404 returned error can't find the container with id 722205b382fa29d72bda6d956dc4f37da436be416acc4e335a21024c00852065 Dec 09 15:15:39 crc kubenswrapper[4555]: I1209 15:15:39.485457 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-dqv6m" event={"ID":"d70d9237-a23b-4955-9f6d-1dee064186ff","Type":"ContainerStarted","Data":"722205b382fa29d72bda6d956dc4f37da436be416acc4e335a21024c00852065"} Dec 09 15:15:40 crc kubenswrapper[4555]: I1209 15:15:40.770640 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-z7j4m" Dec 09 15:15:41 crc kubenswrapper[4555]: I1209 15:15:41.375663 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-kkbk6" Dec 09 15:16:25 crc kubenswrapper[4555]: I1209 15:16:25.531379 4555 patch_prober.go:28] interesting pod/machine-config-daemon-x6p6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 15:16:25 crc kubenswrapper[4555]: I1209 15:16:25.531757 4555 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 15:16:55 crc kubenswrapper[4555]: I1209 15:16:55.531477 4555 patch_prober.go:28] interesting pod/machine-config-daemon-x6p6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 15:16:55 crc kubenswrapper[4555]: I1209 15:16:55.531762 4555 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 15:17:25 crc kubenswrapper[4555]: I1209 15:17:25.531432 4555 patch_prober.go:28] interesting pod/machine-config-daemon-x6p6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 15:17:25 crc kubenswrapper[4555]: I1209 15:17:25.531836 4555 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" 
podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 15:17:25 crc kubenswrapper[4555]: I1209 15:17:25.531884 4555 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" Dec 09 15:17:25 crc kubenswrapper[4555]: I1209 15:17:25.532522 4555 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"80bba19695acae3b426d36d271b4edb8798293963becf54942b939c9706ad20f"} pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 15:17:25 crc kubenswrapper[4555]: I1209 15:17:25.532573 4555 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" containerName="machine-config-daemon" containerID="cri-o://80bba19695acae3b426d36d271b4edb8798293963becf54942b939c9706ad20f" gracePeriod=600 Dec 09 15:17:26 crc kubenswrapper[4555]: I1209 15:17:26.075028 4555 generic.go:334] "Generic (PLEG): container finished" podID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" containerID="80bba19695acae3b426d36d271b4edb8798293963becf54942b939c9706ad20f" exitCode=0 Dec 09 15:17:26 crc kubenswrapper[4555]: I1209 15:17:26.075058 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" event={"ID":"b929ab12-7d60-4e9f-a1a7-3e2d8b086d36","Type":"ContainerDied","Data":"80bba19695acae3b426d36d271b4edb8798293963becf54942b939c9706ad20f"} Dec 09 15:17:26 crc kubenswrapper[4555]: I1209 15:17:26.075279 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" event={"ID":"b929ab12-7d60-4e9f-a1a7-3e2d8b086d36","Type":"ContainerStarted","Data":"619ed8e012f51e2e3bee9f7c1b673ad5f78a54af986c2ba1794a006ee7a95de4"} Dec 09 15:17:26 crc kubenswrapper[4555]: I1209 15:17:26.075317 4555 scope.go:117] "RemoveContainer" containerID="fa5fc33cd5d536ecfcb65483f7919b247963e43ee3446fdfb5d114321c1858f9" Dec 09 15:17:35 crc kubenswrapper[4555]: E1209 15:17:35.196741 4555 log.go:32] "PullImage from image service failed" err="rpc error: code = DeadlineExceeded desc = initializing source docker://38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05: pinging container registry 38.102.83.5:5001: Get \"http://38.102.83.5:5001/v2/\": dial tcp 38.102.83.5:5001: i/o timeout" image="38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05" Dec 09 15:17:35 crc kubenswrapper[4555]: E1209 15:17:35.197171 4555 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = DeadlineExceeded desc = initializing source docker://38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05: pinging container registry 38.102.83.5:5001: Get \"http://38.102.83.5:5001/v2/\": dial tcp 38.102.83.5:5001: i/o timeout" image="38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05" Dec 09 15:17:35 crc kubenswrapper[4555]: E1209 15:17:35.197351 4555 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:registry-server,Image:38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:grpc,HostPort:0,ContainerPort:50051,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-54j7p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-operator-index-z2n5j_openstack-operators(70577dfe-35b4-46d4-bccf-59d719879da2): ErrImagePull: rpc error: code = DeadlineExceeded desc = initializing source docker://38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05: pinging container registry 38.102.83.5:5001: Get \"http://38.102.83.5:5001/v2/\": dial tcp 38.102.83.5:5001: i/o timeout" logger="UnhandledError" Dec 09 15:17:35 crc kubenswrapper[4555]: E1209 15:17:35.199089 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ErrImagePull: \"rpc error: code = DeadlineExceeded desc = initializing source docker://38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05: pinging container registry 38.102.83.5:5001: Get \\\"http://38.102.83.5:5001/v2/\\\": dial tcp 38.102.83.5:5001: i/o timeout\"" pod="openstack-operators/openstack-operator-index-z2n5j" podUID="70577dfe-35b4-46d4-bccf-59d719879da2" Dec 09 15:17:36 crc kubenswrapper[4555]: I1209 15:17:36.345878 4555 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-z2n5j" Dec 09 15:17:36 crc kubenswrapper[4555]: I1209 15:17:36.510523 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54j7p\" (UniqueName: \"kubernetes.io/projected/70577dfe-35b4-46d4-bccf-59d719879da2-kube-api-access-54j7p\") pod \"70577dfe-35b4-46d4-bccf-59d719879da2\" (UID: \"70577dfe-35b4-46d4-bccf-59d719879da2\") " Dec 09 15:17:36 crc kubenswrapper[4555]: I1209 15:17:36.515017 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70577dfe-35b4-46d4-bccf-59d719879da2-kube-api-access-54j7p" (OuterVolumeSpecName: "kube-api-access-54j7p") pod "70577dfe-35b4-46d4-bccf-59d719879da2" (UID: "70577dfe-35b4-46d4-bccf-59d719879da2"). InnerVolumeSpecName "kube-api-access-54j7p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:17:36 crc kubenswrapper[4555]: I1209 15:17:36.611914 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54j7p\" (UniqueName: \"kubernetes.io/projected/70577dfe-35b4-46d4-bccf-59d719879da2-kube-api-access-54j7p\") on node \"crc\" DevicePath \"\"" Dec 09 15:17:37 crc kubenswrapper[4555]: I1209 15:17:37.135178 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-z2n5j" event={"ID":"70577dfe-35b4-46d4-bccf-59d719879da2","Type":"ContainerDied","Data":"16426a3220c0a4000155862b386e50fbdb7af982825d1ed84f5095b01ec1bb33"} Dec 09 15:17:37 crc kubenswrapper[4555]: I1209 15:17:37.135196 4555 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-z2n5j" Dec 09 15:17:37 crc kubenswrapper[4555]: I1209 15:17:37.169463 4555 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-z2n5j"] Dec 09 15:17:37 crc kubenswrapper[4555]: I1209 15:17:37.172134 4555 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-z2n5j"] Dec 09 15:17:38 crc kubenswrapper[4555]: I1209 15:17:38.167646 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70577dfe-35b4-46d4-bccf-59d719879da2" path="/var/lib/kubelet/pods/70577dfe-35b4-46d4-bccf-59d719879da2/volumes" Dec 09 15:17:38 crc kubenswrapper[4555]: E1209 15:17:38.977516 4555 log.go:32] "PullImage from image service failed" err="rpc error: code = DeadlineExceeded desc = initializing source docker://38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05: pinging container registry 38.102.83.5:5001: Get \"http://38.102.83.5:5001/v2/\": dial tcp 38.102.83.5:5001: i/o timeout" image="38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05" Dec 09 15:17:38 crc kubenswrapper[4555]: E1209 15:17:38.977749 4555 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = DeadlineExceeded desc = initializing source docker://38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05: pinging container registry 38.102.83.5:5001: Get \"http://38.102.83.5:5001/v2/\": dial tcp 38.102.83.5:5001: i/o timeout" image="38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05" Dec 09 15:17:38 crc kubenswrapper[4555]: E1209 15:17:38.977863 4555 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:registry-server,Image:38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:grpc,HostPort:0,ContainerPort:50051,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-t2gln,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-operator-index-dqv6m_openstack-operators(d70d9237-a23b-4955-9f6d-1dee064186ff): ErrImagePull: rpc error: code = DeadlineExceeded desc = initializing source docker://38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05: pinging container registry 38.102.83.5:5001: Get \"http://38.102.83.5:5001/v2/\": dial tcp 38.102.83.5:5001: i/o timeout" logger="UnhandledError" Dec 09 15:17:38 crc kubenswrapper[4555]: E1209 15:17:38.979041 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ErrImagePull: \"rpc error: code = DeadlineExceeded desc = initializing source docker://38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05: pinging container registry 38.102.83.5:5001: Get \\\"http://38.102.83.5:5001/v2/\\\": dial tcp 38.102.83.5:5001: i/o timeout\"" pod="openstack-operators/openstack-operator-index-dqv6m" podUID="d70d9237-a23b-4955-9f6d-1dee064186ff" Dec 09 15:17:39 crc kubenswrapper[4555]: E1209 15:17:39.150231 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-dqv6m" 
podUID="d70d9237-a23b-4955-9f6d-1dee064186ff" Dec 09 15:19:25 crc kubenswrapper[4555]: I1209 15:19:25.531843 4555 patch_prober.go:28] interesting pod/machine-config-daemon-x6p6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 15:19:25 crc kubenswrapper[4555]: I1209 15:19:25.532754 4555 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 15:19:51 crc kubenswrapper[4555]: E1209 15:19:51.165774 4555 log.go:32] "PullImage from image service failed" err="rpc error: code = DeadlineExceeded desc = initializing source docker://38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05: pinging container registry 38.102.83.5:5001: Get \"http://38.102.83.5:5001/v2/\": dial tcp 38.102.83.5:5001: i/o timeout" image="38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05" Dec 09 15:19:51 crc kubenswrapper[4555]: E1209 15:19:51.166147 4555 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = DeadlineExceeded desc = initializing source docker://38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05: pinging container registry 38.102.83.5:5001: Get \"http://38.102.83.5:5001/v2/\": dial tcp 38.102.83.5:5001: i/o timeout" image="38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05" Dec 09 15:19:51 crc kubenswrapper[4555]: E1209 15:19:51.166267 4555 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:registry-server,Image:38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:grpc,HostPort:0,ContainerPort:50051,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-t2gln,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe 
-addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-operator-index-dqv6m_openstack-operators(d70d9237-a23b-4955-9f6d-1dee064186ff): ErrImagePull: rpc error: code = DeadlineExceeded desc = initializing source docker://38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05: pinging container registry 38.102.83.5:5001: Get \"http://38.102.83.5:5001/v2/\": dial tcp 38.102.83.5:5001: i/o timeout" logger="UnhandledError" Dec 09 15:19:51 crc kubenswrapper[4555]: E1209 15:19:51.167426 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ErrImagePull: \"rpc error: code = DeadlineExceeded desc = initializing source docker://38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05: pinging container registry 38.102.83.5:5001: Get \\\"http://38.102.83.5:5001/v2/\\\": dial tcp 38.102.83.5:5001: i/o timeout\"" pod="openstack-operators/openstack-operator-index-dqv6m" podUID="d70d9237-a23b-4955-9f6d-1dee064186ff" Dec 09 15:19:55 crc kubenswrapper[4555]: I1209 15:19:55.531359 4555 patch_prober.go:28] interesting pod/machine-config-daemon-x6p6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 15:19:55 crc kubenswrapper[4555]: I1209 15:19:55.531730 4555 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 15:20:03 crc kubenswrapper[4555]: E1209 15:20:03.162317 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-dqv6m" podUID="d70d9237-a23b-4955-9f6d-1dee064186ff" Dec 09 15:20:16 crc kubenswrapper[4555]: I1209 15:20:16.162850 4555 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 09 15:20:25 crc kubenswrapper[4555]: I1209 15:20:25.531010 4555 patch_prober.go:28] interesting 
pod/machine-config-daemon-x6p6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 15:20:25 crc kubenswrapper[4555]: I1209 15:20:25.531359 4555 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 15:20:25 crc kubenswrapper[4555]: I1209 15:20:25.531396 4555 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" Dec 09 15:20:25 crc kubenswrapper[4555]: I1209 15:20:25.532136 4555 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"619ed8e012f51e2e3bee9f7c1b673ad5f78a54af986c2ba1794a006ee7a95de4"} pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 15:20:25 crc kubenswrapper[4555]: I1209 15:20:25.532188 4555 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" containerName="machine-config-daemon" containerID="cri-o://619ed8e012f51e2e3bee9f7c1b673ad5f78a54af986c2ba1794a006ee7a95de4" gracePeriod=600 Dec 09 15:20:26 crc kubenswrapper[4555]: I1209 15:20:26.058427 4555 generic.go:334] "Generic (PLEG): container finished" podID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" containerID="619ed8e012f51e2e3bee9f7c1b673ad5f78a54af986c2ba1794a006ee7a95de4" exitCode=0 Dec 09 15:20:26 crc kubenswrapper[4555]: I1209 15:20:26.058506 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" event={"ID":"b929ab12-7d60-4e9f-a1a7-3e2d8b086d36","Type":"ContainerDied","Data":"619ed8e012f51e2e3bee9f7c1b673ad5f78a54af986c2ba1794a006ee7a95de4"} Dec 09 15:20:26 crc kubenswrapper[4555]: I1209 15:20:26.058617 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" event={"ID":"b929ab12-7d60-4e9f-a1a7-3e2d8b086d36","Type":"ContainerStarted","Data":"167f103c5df266f675a71b91f42df76a88386126f453e6a8bc5bab7fa2aa1818"} Dec 09 15:20:26 crc kubenswrapper[4555]: I1209 15:20:26.058637 4555 scope.go:117] "RemoveContainer" containerID="80bba19695acae3b426d36d271b4edb8798293963becf54942b939c9706ad20f" Dec 09 15:22:16 crc kubenswrapper[4555]: E1209 15:22:16.166548 4555 log.go:32] "PullImage from image service failed" err="rpc error: code = DeadlineExceeded desc = initializing source docker://38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05: pinging container registry 38.102.83.5:5001: Get \"http://38.102.83.5:5001/v2/\": dial tcp 38.102.83.5:5001: i/o timeout" image="38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05" Dec 09 15:22:16 crc kubenswrapper[4555]: E1209 15:22:16.166928 4555 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = DeadlineExceeded desc = initializing source 
docker://38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05: pinging container registry 38.102.83.5:5001: Get \"http://38.102.83.5:5001/v2/\": dial tcp 38.102.83.5:5001: i/o timeout" image="38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05" Dec 09 15:22:16 crc kubenswrapper[4555]: E1209 15:22:16.167036 4555 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:registry-server,Image:38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:grpc,HostPort:0,ContainerPort:50051,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-t2gln,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-operator-index-dqv6m_openstack-operators(d70d9237-a23b-4955-9f6d-1dee064186ff): ErrImagePull: rpc error: code = DeadlineExceeded desc = initializing source docker://38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05: pinging container registry 38.102.83.5:5001: Get \"http://38.102.83.5:5001/v2/\": dial tcp 38.102.83.5:5001: i/o timeout" logger="UnhandledError" Dec 09 15:22:16 crc kubenswrapper[4555]: E1209 15:22:16.168187 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ErrImagePull: \"rpc error: code = DeadlineExceeded desc = initializing source docker://38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05: pinging container registry 38.102.83.5:5001: Get \\\"http://38.102.83.5:5001/v2/\\\": dial tcp 38.102.83.5:5001: i/o timeout\"" 
pod="openstack-operators/openstack-operator-index-dqv6m" podUID="d70d9237-a23b-4955-9f6d-1dee064186ff" Dec 09 15:22:25 crc kubenswrapper[4555]: I1209 15:22:25.531069 4555 patch_prober.go:28] interesting pod/machine-config-daemon-x6p6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 15:22:25 crc kubenswrapper[4555]: I1209 15:22:25.531399 4555 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 15:22:30 crc kubenswrapper[4555]: E1209 15:22:30.165060 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-dqv6m" podUID="d70d9237-a23b-4955-9f6d-1dee064186ff" Dec 09 15:22:42 crc kubenswrapper[4555]: E1209 15:22:42.162423 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-dqv6m" podUID="d70d9237-a23b-4955-9f6d-1dee064186ff" Dec 09 15:22:43 crc kubenswrapper[4555]: I1209 15:22:43.677812 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6bcdh"] Dec 09 15:22:43 crc kubenswrapper[4555]: I1209 15:22:43.680418 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6bcdh" Dec 09 15:22:43 crc kubenswrapper[4555]: I1209 15:22:43.684285 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6bcdh"] Dec 09 15:22:43 crc kubenswrapper[4555]: I1209 15:22:43.711836 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e64e1dab-a56e-4f85-8656-13014c85e46a-catalog-content\") pod \"certified-operators-6bcdh\" (UID: \"e64e1dab-a56e-4f85-8656-13014c85e46a\") " pod="openshift-marketplace/certified-operators-6bcdh" Dec 09 15:22:43 crc kubenswrapper[4555]: I1209 15:22:43.711896 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e64e1dab-a56e-4f85-8656-13014c85e46a-utilities\") pod \"certified-operators-6bcdh\" (UID: \"e64e1dab-a56e-4f85-8656-13014c85e46a\") " pod="openshift-marketplace/certified-operators-6bcdh" Dec 09 15:22:43 crc kubenswrapper[4555]: I1209 15:22:43.711946 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kx5wm\" (UniqueName: \"kubernetes.io/projected/e64e1dab-a56e-4f85-8656-13014c85e46a-kube-api-access-kx5wm\") pod \"certified-operators-6bcdh\" (UID: \"e64e1dab-a56e-4f85-8656-13014c85e46a\") " pod="openshift-marketplace/certified-operators-6bcdh" Dec 09 15:22:43 crc kubenswrapper[4555]: I1209 15:22:43.813336 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e64e1dab-a56e-4f85-8656-13014c85e46a-catalog-content\") pod \"certified-operators-6bcdh\" (UID: \"e64e1dab-a56e-4f85-8656-13014c85e46a\") " pod="openshift-marketplace/certified-operators-6bcdh" Dec 09 15:22:43 crc kubenswrapper[4555]: I1209 15:22:43.813385 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e64e1dab-a56e-4f85-8656-13014c85e46a-utilities\") pod \"certified-operators-6bcdh\" (UID: \"e64e1dab-a56e-4f85-8656-13014c85e46a\") " pod="openshift-marketplace/certified-operators-6bcdh" Dec 09 15:22:43 crc kubenswrapper[4555]: I1209 15:22:43.813411 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kx5wm\" (UniqueName: \"kubernetes.io/projected/e64e1dab-a56e-4f85-8656-13014c85e46a-kube-api-access-kx5wm\") pod \"certified-operators-6bcdh\" (UID: \"e64e1dab-a56e-4f85-8656-13014c85e46a\") " pod="openshift-marketplace/certified-operators-6bcdh" Dec 09 15:22:43 crc kubenswrapper[4555]: I1209 15:22:43.813827 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e64e1dab-a56e-4f85-8656-13014c85e46a-catalog-content\") pod \"certified-operators-6bcdh\" (UID: \"e64e1dab-a56e-4f85-8656-13014c85e46a\") " pod="openshift-marketplace/certified-operators-6bcdh" Dec 09 15:22:43 crc kubenswrapper[4555]: I1209 15:22:43.813885 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e64e1dab-a56e-4f85-8656-13014c85e46a-utilities\") pod \"certified-operators-6bcdh\" (UID: \"e64e1dab-a56e-4f85-8656-13014c85e46a\") " pod="openshift-marketplace/certified-operators-6bcdh" Dec 09 15:22:43 crc kubenswrapper[4555]: I1209 15:22:43.828883 4555 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-kx5wm\" (UniqueName: \"kubernetes.io/projected/e64e1dab-a56e-4f85-8656-13014c85e46a-kube-api-access-kx5wm\") pod \"certified-operators-6bcdh\" (UID: \"e64e1dab-a56e-4f85-8656-13014c85e46a\") " pod="openshift-marketplace/certified-operators-6bcdh" Dec 09 15:22:43 crc kubenswrapper[4555]: I1209 15:22:43.999041 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6bcdh" Dec 09 15:22:44 crc kubenswrapper[4555]: I1209 15:22:44.384809 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6bcdh"] Dec 09 15:22:44 crc kubenswrapper[4555]: I1209 15:22:44.785960 4555 generic.go:334] "Generic (PLEG): container finished" podID="e64e1dab-a56e-4f85-8656-13014c85e46a" containerID="f3bb6efc530640ca1dc453dff8e5c4566207e509be5ffd8547c2a36c7653cfa4" exitCode=0 Dec 09 15:22:44 crc kubenswrapper[4555]: I1209 15:22:44.786007 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6bcdh" event={"ID":"e64e1dab-a56e-4f85-8656-13014c85e46a","Type":"ContainerDied","Data":"f3bb6efc530640ca1dc453dff8e5c4566207e509be5ffd8547c2a36c7653cfa4"} Dec 09 15:22:44 crc kubenswrapper[4555]: I1209 15:22:44.786177 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6bcdh" event={"ID":"e64e1dab-a56e-4f85-8656-13014c85e46a","Type":"ContainerStarted","Data":"e109acf72b8b73f2f96cb7914bdfc27e058f95405a7df1ad499f3472f3cd24de"} Dec 09 15:22:45 crc kubenswrapper[4555]: I1209 15:22:45.792665 4555 generic.go:334] "Generic (PLEG): container finished" podID="e64e1dab-a56e-4f85-8656-13014c85e46a" containerID="f0d19a7ef79f3638740c228534fe833806fd5a2ddbba5cec50e9f9324721336e" exitCode=0 Dec 09 15:22:45 crc kubenswrapper[4555]: I1209 15:22:45.792698 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6bcdh" event={"ID":"e64e1dab-a56e-4f85-8656-13014c85e46a","Type":"ContainerDied","Data":"f0d19a7ef79f3638740c228534fe833806fd5a2ddbba5cec50e9f9324721336e"} Dec 09 15:22:46 crc kubenswrapper[4555]: I1209 15:22:46.800581 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6bcdh" event={"ID":"e64e1dab-a56e-4f85-8656-13014c85e46a","Type":"ContainerStarted","Data":"fc6eb9128675d910d67134dfb948789a824b0667820bf85c23a3bcd63a654b04"} Dec 09 15:22:46 crc kubenswrapper[4555]: I1209 15:22:46.816054 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6bcdh" podStartSLOduration=2.379949145 podStartE2EDuration="3.816038599s" podCreationTimestamp="2025-12-09 15:22:43 +0000 UTC" firstStartedPulling="2025-12-09 15:22:44.787194484 +0000 UTC m=+1474.923907202" lastFinishedPulling="2025-12-09 15:22:46.223283938 +0000 UTC m=+1476.359996656" observedRunningTime="2025-12-09 15:22:46.815271385 +0000 UTC m=+1476.951984103" watchObservedRunningTime="2025-12-09 15:22:46.816038599 +0000 UTC m=+1476.952751316" Dec 09 15:22:54 crc kubenswrapper[4555]: I1209 15:22:53.999715 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6bcdh" Dec 09 15:22:54 crc kubenswrapper[4555]: I1209 15:22:54.000052 4555 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6bcdh" Dec 09 15:22:54 crc kubenswrapper[4555]: I1209 15:22:54.029359 4555 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6bcdh" Dec 09 15:22:54 crc kubenswrapper[4555]: I1209 15:22:54.868236 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6bcdh" Dec 09 15:22:54 crc kubenswrapper[4555]: I1209 15:22:54.899186 4555 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6bcdh"] Dec 09 15:22:55 crc kubenswrapper[4555]: I1209 15:22:55.531227 4555 patch_prober.go:28] interesting pod/machine-config-daemon-x6p6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 15:22:55 crc kubenswrapper[4555]: I1209 15:22:55.531278 4555 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 15:22:56 crc kubenswrapper[4555]: E1209 15:22:56.162732 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-dqv6m" podUID="d70d9237-a23b-4955-9f6d-1dee064186ff" Dec 09 15:22:56 crc kubenswrapper[4555]: I1209 15:22:56.850332 4555 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6bcdh" podUID="e64e1dab-a56e-4f85-8656-13014c85e46a" containerName="registry-server" containerID="cri-o://fc6eb9128675d910d67134dfb948789a824b0667820bf85c23a3bcd63a654b04" gracePeriod=2 Dec 09 15:22:57 crc kubenswrapper[4555]: I1209 15:22:57.183075 4555 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6bcdh" Dec 09 15:22:57 crc kubenswrapper[4555]: I1209 15:22:57.297223 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kx5wm\" (UniqueName: \"kubernetes.io/projected/e64e1dab-a56e-4f85-8656-13014c85e46a-kube-api-access-kx5wm\") pod \"e64e1dab-a56e-4f85-8656-13014c85e46a\" (UID: \"e64e1dab-a56e-4f85-8656-13014c85e46a\") " Dec 09 15:22:57 crc kubenswrapper[4555]: I1209 15:22:57.297491 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e64e1dab-a56e-4f85-8656-13014c85e46a-utilities\") pod \"e64e1dab-a56e-4f85-8656-13014c85e46a\" (UID: \"e64e1dab-a56e-4f85-8656-13014c85e46a\") " Dec 09 15:22:57 crc kubenswrapper[4555]: I1209 15:22:57.297578 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e64e1dab-a56e-4f85-8656-13014c85e46a-catalog-content\") pod \"e64e1dab-a56e-4f85-8656-13014c85e46a\" (UID: \"e64e1dab-a56e-4f85-8656-13014c85e46a\") " Dec 09 15:22:57 crc kubenswrapper[4555]: I1209 15:22:57.298291 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e64e1dab-a56e-4f85-8656-13014c85e46a-utilities" (OuterVolumeSpecName: "utilities") pod "e64e1dab-a56e-4f85-8656-13014c85e46a" (UID: "e64e1dab-a56e-4f85-8656-13014c85e46a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:22:57 crc kubenswrapper[4555]: I1209 15:22:57.301909 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e64e1dab-a56e-4f85-8656-13014c85e46a-kube-api-access-kx5wm" (OuterVolumeSpecName: "kube-api-access-kx5wm") pod "e64e1dab-a56e-4f85-8656-13014c85e46a" (UID: "e64e1dab-a56e-4f85-8656-13014c85e46a"). InnerVolumeSpecName "kube-api-access-kx5wm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:22:57 crc kubenswrapper[4555]: I1209 15:22:57.330153 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e64e1dab-a56e-4f85-8656-13014c85e46a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e64e1dab-a56e-4f85-8656-13014c85e46a" (UID: "e64e1dab-a56e-4f85-8656-13014c85e46a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:22:57 crc kubenswrapper[4555]: I1209 15:22:57.398878 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kx5wm\" (UniqueName: \"kubernetes.io/projected/e64e1dab-a56e-4f85-8656-13014c85e46a-kube-api-access-kx5wm\") on node \"crc\" DevicePath \"\"" Dec 09 15:22:57 crc kubenswrapper[4555]: I1209 15:22:57.398900 4555 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e64e1dab-a56e-4f85-8656-13014c85e46a-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 15:22:57 crc kubenswrapper[4555]: I1209 15:22:57.398909 4555 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e64e1dab-a56e-4f85-8656-13014c85e46a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 15:22:57 crc kubenswrapper[4555]: I1209 15:22:57.857179 4555 generic.go:334] "Generic (PLEG): container finished" podID="e64e1dab-a56e-4f85-8656-13014c85e46a" containerID="fc6eb9128675d910d67134dfb948789a824b0667820bf85c23a3bcd63a654b04" exitCode=0 Dec 09 15:22:57 crc kubenswrapper[4555]: I1209 15:22:57.857205 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6bcdh" event={"ID":"e64e1dab-a56e-4f85-8656-13014c85e46a","Type":"ContainerDied","Data":"fc6eb9128675d910d67134dfb948789a824b0667820bf85c23a3bcd63a654b04"} Dec 09 15:22:57 crc kubenswrapper[4555]: I1209 15:22:57.857233 4555 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6bcdh" Dec 09 15:22:57 crc kubenswrapper[4555]: I1209 15:22:57.857246 4555 scope.go:117] "RemoveContainer" containerID="fc6eb9128675d910d67134dfb948789a824b0667820bf85c23a3bcd63a654b04" Dec 09 15:22:57 crc kubenswrapper[4555]: I1209 15:22:57.857237 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6bcdh" event={"ID":"e64e1dab-a56e-4f85-8656-13014c85e46a","Type":"ContainerDied","Data":"e109acf72b8b73f2f96cb7914bdfc27e058f95405a7df1ad499f3472f3cd24de"} Dec 09 15:22:57 crc kubenswrapper[4555]: I1209 15:22:57.870934 4555 scope.go:117] "RemoveContainer" containerID="f0d19a7ef79f3638740c228534fe833806fd5a2ddbba5cec50e9f9324721336e" Dec 09 15:22:57 crc kubenswrapper[4555]: I1209 15:22:57.878484 4555 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6bcdh"] Dec 09 15:22:57 crc kubenswrapper[4555]: I1209 15:22:57.884101 4555 scope.go:117] "RemoveContainer" containerID="f3bb6efc530640ca1dc453dff8e5c4566207e509be5ffd8547c2a36c7653cfa4" Dec 09 15:22:57 crc kubenswrapper[4555]: I1209 15:22:57.885922 4555 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6bcdh"] Dec 09 15:22:57 crc kubenswrapper[4555]: I1209 15:22:57.904117 4555 scope.go:117] "RemoveContainer" containerID="fc6eb9128675d910d67134dfb948789a824b0667820bf85c23a3bcd63a654b04" Dec 09 15:22:57 crc kubenswrapper[4555]: E1209 15:22:57.904497 4555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc6eb9128675d910d67134dfb948789a824b0667820bf85c23a3bcd63a654b04\": container with ID starting with fc6eb9128675d910d67134dfb948789a824b0667820bf85c23a3bcd63a654b04 not found: ID does not exist" containerID="fc6eb9128675d910d67134dfb948789a824b0667820bf85c23a3bcd63a654b04" Dec 09 15:22:57 crc kubenswrapper[4555]: I1209 15:22:57.904525 
4555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc6eb9128675d910d67134dfb948789a824b0667820bf85c23a3bcd63a654b04"} err="failed to get container status \"fc6eb9128675d910d67134dfb948789a824b0667820bf85c23a3bcd63a654b04\": rpc error: code = NotFound desc = could not find container \"fc6eb9128675d910d67134dfb948789a824b0667820bf85c23a3bcd63a654b04\": container with ID starting with fc6eb9128675d910d67134dfb948789a824b0667820bf85c23a3bcd63a654b04 not found: ID does not exist" Dec 09 15:22:57 crc kubenswrapper[4555]: I1209 15:22:57.904544 4555 scope.go:117] "RemoveContainer" containerID="f0d19a7ef79f3638740c228534fe833806fd5a2ddbba5cec50e9f9324721336e" Dec 09 15:22:57 crc kubenswrapper[4555]: E1209 15:22:57.904774 4555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0d19a7ef79f3638740c228534fe833806fd5a2ddbba5cec50e9f9324721336e\": container with ID starting with f0d19a7ef79f3638740c228534fe833806fd5a2ddbba5cec50e9f9324721336e not found: ID does not exist" containerID="f0d19a7ef79f3638740c228534fe833806fd5a2ddbba5cec50e9f9324721336e" Dec 09 15:22:57 crc kubenswrapper[4555]: I1209 15:22:57.904804 4555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0d19a7ef79f3638740c228534fe833806fd5a2ddbba5cec50e9f9324721336e"} err="failed to get container status \"f0d19a7ef79f3638740c228534fe833806fd5a2ddbba5cec50e9f9324721336e\": rpc error: code = NotFound desc = could not find container \"f0d19a7ef79f3638740c228534fe833806fd5a2ddbba5cec50e9f9324721336e\": container with ID starting with f0d19a7ef79f3638740c228534fe833806fd5a2ddbba5cec50e9f9324721336e not found: ID does not exist" Dec 09 15:22:57 crc kubenswrapper[4555]: I1209 15:22:57.904825 4555 scope.go:117] "RemoveContainer" containerID="f3bb6efc530640ca1dc453dff8e5c4566207e509be5ffd8547c2a36c7653cfa4" Dec 09 15:22:57 crc kubenswrapper[4555]: E1209 15:22:57.905042 4555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3bb6efc530640ca1dc453dff8e5c4566207e509be5ffd8547c2a36c7653cfa4\": container with ID starting with f3bb6efc530640ca1dc453dff8e5c4566207e509be5ffd8547c2a36c7653cfa4 not found: ID does not exist" containerID="f3bb6efc530640ca1dc453dff8e5c4566207e509be5ffd8547c2a36c7653cfa4" Dec 09 15:22:57 crc kubenswrapper[4555]: I1209 15:22:57.905111 4555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3bb6efc530640ca1dc453dff8e5c4566207e509be5ffd8547c2a36c7653cfa4"} err="failed to get container status \"f3bb6efc530640ca1dc453dff8e5c4566207e509be5ffd8547c2a36c7653cfa4\": rpc error: code = NotFound desc = could not find container \"f3bb6efc530640ca1dc453dff8e5c4566207e509be5ffd8547c2a36c7653cfa4\": container with ID starting with f3bb6efc530640ca1dc453dff8e5c4566207e509be5ffd8547c2a36c7653cfa4 not found: ID does not exist" Dec 09 15:22:58 crc kubenswrapper[4555]: I1209 15:22:58.167647 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e64e1dab-a56e-4f85-8656-13014c85e46a" path="/var/lib/kubelet/pods/e64e1dab-a56e-4f85-8656-13014c85e46a/volumes" Dec 09 15:23:25 crc kubenswrapper[4555]: I1209 15:23:25.531397 4555 patch_prober.go:28] interesting pod/machine-config-daemon-x6p6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 15:23:25 crc kubenswrapper[4555]: I1209 15:23:25.531771 4555 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 15:23:25 crc kubenswrapper[4555]: I1209 15:23:25.531803 4555 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" Dec 09 15:23:25 crc kubenswrapper[4555]: I1209 15:23:25.532325 4555 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"167f103c5df266f675a71b91f42df76a88386126f453e6a8bc5bab7fa2aa1818"} pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 15:23:25 crc kubenswrapper[4555]: I1209 15:23:25.532373 4555 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" containerName="machine-config-daemon" containerID="cri-o://167f103c5df266f675a71b91f42df76a88386126f453e6a8bc5bab7fa2aa1818" gracePeriod=600 Dec 09 15:23:25 crc kubenswrapper[4555]: E1209 15:23:25.648162 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6p6q_openshift-machine-config-operator(b929ab12-7d60-4e9f-a1a7-3e2d8b086d36)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" Dec 09 15:23:26 crc kubenswrapper[4555]: I1209 15:23:26.006958 4555 generic.go:334] "Generic (PLEG): container finished" podID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" containerID="167f103c5df266f675a71b91f42df76a88386126f453e6a8bc5bab7fa2aa1818" exitCode=0 Dec 09 15:23:26 crc kubenswrapper[4555]: I1209 15:23:26.006991 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" event={"ID":"b929ab12-7d60-4e9f-a1a7-3e2d8b086d36","Type":"ContainerDied","Data":"167f103c5df266f675a71b91f42df76a88386126f453e6a8bc5bab7fa2aa1818"} Dec 09 15:23:26 crc kubenswrapper[4555]: I1209 15:23:26.007020 4555 scope.go:117] "RemoveContainer" containerID="619ed8e012f51e2e3bee9f7c1b673ad5f78a54af986c2ba1794a006ee7a95de4" Dec 09 15:23:26 crc kubenswrapper[4555]: I1209 15:23:26.007630 4555 scope.go:117] "RemoveContainer" containerID="167f103c5df266f675a71b91f42df76a88386126f453e6a8bc5bab7fa2aa1818" Dec 09 15:23:26 crc kubenswrapper[4555]: E1209 15:23:26.008997 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6p6q_openshift-machine-config-operator(b929ab12-7d60-4e9f-a1a7-3e2d8b086d36)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" Dec 09 15:23:41 crc kubenswrapper[4555]: I1209 15:23:41.161448 4555 scope.go:117] "RemoveContainer" 
containerID="167f103c5df266f675a71b91f42df76a88386126f453e6a8bc5bab7fa2aa1818" Dec 09 15:23:41 crc kubenswrapper[4555]: E1209 15:23:41.162957 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6p6q_openshift-machine-config-operator(b929ab12-7d60-4e9f-a1a7-3e2d8b086d36)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" Dec 09 15:23:53 crc kubenswrapper[4555]: I1209 15:23:53.160698 4555 scope.go:117] "RemoveContainer" containerID="167f103c5df266f675a71b91f42df76a88386126f453e6a8bc5bab7fa2aa1818" Dec 09 15:23:53 crc kubenswrapper[4555]: E1209 15:23:53.161235 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6p6q_openshift-machine-config-operator(b929ab12-7d60-4e9f-a1a7-3e2d8b086d36)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" Dec 09 15:24:04 crc kubenswrapper[4555]: I1209 15:24:04.160763 4555 scope.go:117] "RemoveContainer" containerID="167f103c5df266f675a71b91f42df76a88386126f453e6a8bc5bab7fa2aa1818" Dec 09 15:24:04 crc kubenswrapper[4555]: E1209 15:24:04.161400 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6p6q_openshift-machine-config-operator(b929ab12-7d60-4e9f-a1a7-3e2d8b086d36)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" Dec 09 15:24:17 crc kubenswrapper[4555]: I1209 15:24:17.161495 4555 scope.go:117] "RemoveContainer" containerID="167f103c5df266f675a71b91f42df76a88386126f453e6a8bc5bab7fa2aa1818" Dec 09 15:24:17 crc kubenswrapper[4555]: E1209 15:24:17.162061 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6p6q_openshift-machine-config-operator(b929ab12-7d60-4e9f-a1a7-3e2d8b086d36)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" Dec 09 15:24:28 crc kubenswrapper[4555]: I1209 15:24:28.161188 4555 scope.go:117] "RemoveContainer" containerID="167f103c5df266f675a71b91f42df76a88386126f453e6a8bc5bab7fa2aa1818" Dec 09 15:24:28 crc kubenswrapper[4555]: E1209 15:24:28.162027 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6p6q_openshift-machine-config-operator(b929ab12-7d60-4e9f-a1a7-3e2d8b086d36)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" Dec 09 15:24:42 crc kubenswrapper[4555]: I1209 15:24:42.161869 4555 scope.go:117] "RemoveContainer" containerID="167f103c5df266f675a71b91f42df76a88386126f453e6a8bc5bab7fa2aa1818" Dec 09 15:24:42 crc kubenswrapper[4555]: E1209 15:24:42.162442 4555 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6p6q_openshift-machine-config-operator(b929ab12-7d60-4e9f-a1a7-3e2d8b086d36)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" Dec 09 15:24:57 crc kubenswrapper[4555]: I1209 15:24:57.165253 4555 scope.go:117] "RemoveContainer" containerID="167f103c5df266f675a71b91f42df76a88386126f453e6a8bc5bab7fa2aa1818" Dec 09 15:24:57 crc kubenswrapper[4555]: E1209 15:24:57.166007 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6p6q_openshift-machine-config-operator(b929ab12-7d60-4e9f-a1a7-3e2d8b086d36)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" Dec 09 15:25:11 crc kubenswrapper[4555]: E1209 15:25:11.169373 4555 log.go:32] "PullImage from image service failed" err="rpc error: code = DeadlineExceeded desc = initializing source docker://38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05: pinging container registry 38.102.83.5:5001: Get \"http://38.102.83.5:5001/v2/\": dial tcp 38.102.83.5:5001: i/o timeout" image="38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05" Dec 09 15:25:11 crc kubenswrapper[4555]: E1209 15:25:11.169752 4555 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = DeadlineExceeded desc = initializing source docker://38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05: pinging container registry 38.102.83.5:5001: Get \"http://38.102.83.5:5001/v2/\": dial tcp 38.102.83.5:5001: i/o timeout" image="38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05" Dec 09 15:25:11 crc kubenswrapper[4555]: E1209 15:25:11.169847 4555 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:registry-server,Image:38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:grpc,HostPort:0,ContainerPort:50051,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-t2gln,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe 
-addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-operator-index-dqv6m_openstack-operators(d70d9237-a23b-4955-9f6d-1dee064186ff): ErrImagePull: rpc error: code = DeadlineExceeded desc = initializing source docker://38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05: pinging container registry 38.102.83.5:5001: Get \"http://38.102.83.5:5001/v2/\": dial tcp 38.102.83.5:5001: i/o timeout" logger="UnhandledError" Dec 09 15:25:11 crc kubenswrapper[4555]: E1209 15:25:11.171027 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ErrImagePull: \"rpc error: code = DeadlineExceeded desc = initializing source docker://38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05: pinging container registry 38.102.83.5:5001: Get \\\"http://38.102.83.5:5001/v2/\\\": dial tcp 38.102.83.5:5001: i/o timeout\"" pod="openstack-operators/openstack-operator-index-dqv6m" podUID="d70d9237-a23b-4955-9f6d-1dee064186ff" Dec 09 15:25:12 crc kubenswrapper[4555]: I1209 15:25:12.161175 4555 scope.go:117] "RemoveContainer" containerID="167f103c5df266f675a71b91f42df76a88386126f453e6a8bc5bab7fa2aa1818" Dec 09 15:25:12 crc kubenswrapper[4555]: E1209 15:25:12.161497 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6p6q_openshift-machine-config-operator(b929ab12-7d60-4e9f-a1a7-3e2d8b086d36)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" Dec 09 15:25:24 crc kubenswrapper[4555]: I1209 15:25:24.160702 4555 scope.go:117] "RemoveContainer" containerID="167f103c5df266f675a71b91f42df76a88386126f453e6a8bc5bab7fa2aa1818" Dec 09 15:25:24 crc kubenswrapper[4555]: E1209 15:25:24.161182 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6p6q_openshift-machine-config-operator(b929ab12-7d60-4e9f-a1a7-3e2d8b086d36)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" Dec 09 15:25:26 crc kubenswrapper[4555]: E1209 15:25:26.161812 4555 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-dqv6m" podUID="d70d9237-a23b-4955-9f6d-1dee064186ff" Dec 09 15:25:38 crc kubenswrapper[4555]: E1209 15:25:38.163067 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-dqv6m" podUID="d70d9237-a23b-4955-9f6d-1dee064186ff" Dec 09 15:25:39 crc kubenswrapper[4555]: I1209 15:25:39.160805 4555 scope.go:117] "RemoveContainer" containerID="167f103c5df266f675a71b91f42df76a88386126f453e6a8bc5bab7fa2aa1818" Dec 09 15:25:39 crc kubenswrapper[4555]: E1209 15:25:39.161155 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6p6q_openshift-machine-config-operator(b929ab12-7d60-4e9f-a1a7-3e2d8b086d36)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" Dec 09 15:25:52 crc kubenswrapper[4555]: I1209 15:25:52.161380 4555 scope.go:117] "RemoveContainer" containerID="167f103c5df266f675a71b91f42df76a88386126f453e6a8bc5bab7fa2aa1818" Dec 09 15:25:52 crc kubenswrapper[4555]: E1209 15:25:52.161993 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6p6q_openshift-machine-config-operator(b929ab12-7d60-4e9f-a1a7-3e2d8b086d36)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" Dec 09 15:25:53 crc kubenswrapper[4555]: E1209 15:25:53.162615 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-dqv6m" podUID="d70d9237-a23b-4955-9f6d-1dee064186ff" Dec 09 15:26:04 crc kubenswrapper[4555]: I1209 15:26:04.161150 4555 scope.go:117] "RemoveContainer" containerID="167f103c5df266f675a71b91f42df76a88386126f453e6a8bc5bab7fa2aa1818" Dec 09 15:26:04 crc kubenswrapper[4555]: E1209 15:26:04.161696 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6p6q_openshift-machine-config-operator(b929ab12-7d60-4e9f-a1a7-3e2d8b086d36)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" Dec 09 15:26:05 crc kubenswrapper[4555]: E1209 15:26:05.162663 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image 
\\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-dqv6m" podUID="d70d9237-a23b-4955-9f6d-1dee064186ff" Dec 09 15:26:17 crc kubenswrapper[4555]: E1209 15:26:17.163055 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-dqv6m" podUID="d70d9237-a23b-4955-9f6d-1dee064186ff" Dec 09 15:26:18 crc kubenswrapper[4555]: I1209 15:26:18.161269 4555 scope.go:117] "RemoveContainer" containerID="167f103c5df266f675a71b91f42df76a88386126f453e6a8bc5bab7fa2aa1818" Dec 09 15:26:18 crc kubenswrapper[4555]: E1209 15:26:18.161845 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6p6q_openshift-machine-config-operator(b929ab12-7d60-4e9f-a1a7-3e2d8b086d36)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" Dec 09 15:26:31 crc kubenswrapper[4555]: I1209 15:26:31.161954 4555 scope.go:117] "RemoveContainer" containerID="167f103c5df266f675a71b91f42df76a88386126f453e6a8bc5bab7fa2aa1818" Dec 09 15:26:31 crc kubenswrapper[4555]: E1209 15:26:31.162451 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6p6q_openshift-machine-config-operator(b929ab12-7d60-4e9f-a1a7-3e2d8b086d36)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" Dec 09 15:26:31 crc kubenswrapper[4555]: E1209 15:26:31.162751 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-dqv6m" podUID="d70d9237-a23b-4955-9f6d-1dee064186ff" Dec 09 15:26:43 crc kubenswrapper[4555]: I1209 15:26:43.161579 4555 scope.go:117] "RemoveContainer" containerID="167f103c5df266f675a71b91f42df76a88386126f453e6a8bc5bab7fa2aa1818" Dec 09 15:26:43 crc kubenswrapper[4555]: E1209 15:26:43.162215 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6p6q_openshift-machine-config-operator(b929ab12-7d60-4e9f-a1a7-3e2d8b086d36)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" Dec 09 15:26:46 crc kubenswrapper[4555]: I1209 15:26:46.162496 4555 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 09 15:26:49 crc kubenswrapper[4555]: I1209 15:26:49.452043 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-sqk9k"] Dec 09 15:26:49 crc kubenswrapper[4555]: E1209 15:26:49.452541 4555 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="e64e1dab-a56e-4f85-8656-13014c85e46a" containerName="extract-utilities" Dec 09 15:26:49 crc kubenswrapper[4555]: I1209 15:26:49.452554 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="e64e1dab-a56e-4f85-8656-13014c85e46a" containerName="extract-utilities" Dec 09 15:26:49 crc kubenswrapper[4555]: E1209 15:26:49.452571 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e64e1dab-a56e-4f85-8656-13014c85e46a" containerName="extract-content" Dec 09 15:26:49 crc kubenswrapper[4555]: I1209 15:26:49.452576 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="e64e1dab-a56e-4f85-8656-13014c85e46a" containerName="extract-content" Dec 09 15:26:49 crc kubenswrapper[4555]: E1209 15:26:49.452591 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e64e1dab-a56e-4f85-8656-13014c85e46a" containerName="registry-server" Dec 09 15:26:49 crc kubenswrapper[4555]: I1209 15:26:49.452596 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="e64e1dab-a56e-4f85-8656-13014c85e46a" containerName="registry-server" Dec 09 15:26:49 crc kubenswrapper[4555]: I1209 15:26:49.452745 4555 memory_manager.go:354] "RemoveStaleState removing state" podUID="e64e1dab-a56e-4f85-8656-13014c85e46a" containerName="registry-server" Dec 09 15:26:49 crc kubenswrapper[4555]: I1209 15:26:49.453679 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sqk9k" Dec 09 15:26:49 crc kubenswrapper[4555]: I1209 15:26:49.456889 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sqk9k"] Dec 09 15:26:49 crc kubenswrapper[4555]: I1209 15:26:49.602040 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3b2ab95-c092-4739-8cbc-02e9dd6de6e7-catalog-content\") pod \"redhat-operators-sqk9k\" (UID: \"f3b2ab95-c092-4739-8cbc-02e9dd6de6e7\") " pod="openshift-marketplace/redhat-operators-sqk9k" Dec 09 15:26:49 crc kubenswrapper[4555]: I1209 15:26:49.602281 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3b2ab95-c092-4739-8cbc-02e9dd6de6e7-utilities\") pod \"redhat-operators-sqk9k\" (UID: \"f3b2ab95-c092-4739-8cbc-02e9dd6de6e7\") " pod="openshift-marketplace/redhat-operators-sqk9k" Dec 09 15:26:49 crc kubenswrapper[4555]: I1209 15:26:49.602345 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xttwb\" (UniqueName: \"kubernetes.io/projected/f3b2ab95-c092-4739-8cbc-02e9dd6de6e7-kube-api-access-xttwb\") pod \"redhat-operators-sqk9k\" (UID: \"f3b2ab95-c092-4739-8cbc-02e9dd6de6e7\") " pod="openshift-marketplace/redhat-operators-sqk9k" Dec 09 15:26:49 crc kubenswrapper[4555]: I1209 15:26:49.703678 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xttwb\" (UniqueName: \"kubernetes.io/projected/f3b2ab95-c092-4739-8cbc-02e9dd6de6e7-kube-api-access-xttwb\") pod \"redhat-operators-sqk9k\" (UID: \"f3b2ab95-c092-4739-8cbc-02e9dd6de6e7\") " pod="openshift-marketplace/redhat-operators-sqk9k" Dec 09 15:26:49 crc kubenswrapper[4555]: I1209 15:26:49.703776 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/f3b2ab95-c092-4739-8cbc-02e9dd6de6e7-catalog-content\") pod \"redhat-operators-sqk9k\" (UID: \"f3b2ab95-c092-4739-8cbc-02e9dd6de6e7\") " pod="openshift-marketplace/redhat-operators-sqk9k" Dec 09 15:26:49 crc kubenswrapper[4555]: I1209 15:26:49.703832 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3b2ab95-c092-4739-8cbc-02e9dd6de6e7-utilities\") pod \"redhat-operators-sqk9k\" (UID: \"f3b2ab95-c092-4739-8cbc-02e9dd6de6e7\") " pod="openshift-marketplace/redhat-operators-sqk9k" Dec 09 15:26:49 crc kubenswrapper[4555]: I1209 15:26:49.704177 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3b2ab95-c092-4739-8cbc-02e9dd6de6e7-catalog-content\") pod \"redhat-operators-sqk9k\" (UID: \"f3b2ab95-c092-4739-8cbc-02e9dd6de6e7\") " pod="openshift-marketplace/redhat-operators-sqk9k" Dec 09 15:26:49 crc kubenswrapper[4555]: I1209 15:26:49.704233 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3b2ab95-c092-4739-8cbc-02e9dd6de6e7-utilities\") pod \"redhat-operators-sqk9k\" (UID: \"f3b2ab95-c092-4739-8cbc-02e9dd6de6e7\") " pod="openshift-marketplace/redhat-operators-sqk9k" Dec 09 15:26:49 crc kubenswrapper[4555]: I1209 15:26:49.721199 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xttwb\" (UniqueName: \"kubernetes.io/projected/f3b2ab95-c092-4739-8cbc-02e9dd6de6e7-kube-api-access-xttwb\") pod \"redhat-operators-sqk9k\" (UID: \"f3b2ab95-c092-4739-8cbc-02e9dd6de6e7\") " pod="openshift-marketplace/redhat-operators-sqk9k" Dec 09 15:26:49 crc kubenswrapper[4555]: I1209 15:26:49.795258 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sqk9k" Dec 09 15:26:50 crc kubenswrapper[4555]: I1209 15:26:50.158008 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sqk9k"] Dec 09 15:26:51 crc kubenswrapper[4555]: I1209 15:26:51.105361 4555 generic.go:334] "Generic (PLEG): container finished" podID="f3b2ab95-c092-4739-8cbc-02e9dd6de6e7" containerID="3326831e720ff500ac03ccfaab060b88342dedc0596b2a8d39ed636923625859" exitCode=0 Dec 09 15:26:51 crc kubenswrapper[4555]: I1209 15:26:51.105424 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sqk9k" event={"ID":"f3b2ab95-c092-4739-8cbc-02e9dd6de6e7","Type":"ContainerDied","Data":"3326831e720ff500ac03ccfaab060b88342dedc0596b2a8d39ed636923625859"} Dec 09 15:26:51 crc kubenswrapper[4555]: I1209 15:26:51.105707 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sqk9k" event={"ID":"f3b2ab95-c092-4739-8cbc-02e9dd6de6e7","Type":"ContainerStarted","Data":"2ada2d960ef1dc0e57907207a284487e037b7c4f040c24ecc177e1a0c55da17b"} Dec 09 15:26:56 crc kubenswrapper[4555]: I1209 15:26:56.162082 4555 scope.go:117] "RemoveContainer" containerID="167f103c5df266f675a71b91f42df76a88386126f453e6a8bc5bab7fa2aa1818" Dec 09 15:26:56 crc kubenswrapper[4555]: E1209 15:26:56.162727 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6p6q_openshift-machine-config-operator(b929ab12-7d60-4e9f-a1a7-3e2d8b086d36)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" Dec 09 15:27:00 crc kubenswrapper[4555]: I1209 15:27:00.156753 4555 generic.go:334] "Generic (PLEG): container finished" podID="f3b2ab95-c092-4739-8cbc-02e9dd6de6e7" containerID="ec5c41dd46d4574ac49d8abf4849554405f2254e1fe5f827d6b2cc61c0d2e5b1" exitCode=0 Dec 09 15:27:00 crc kubenswrapper[4555]: I1209 15:27:00.156843 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sqk9k" event={"ID":"f3b2ab95-c092-4739-8cbc-02e9dd6de6e7","Type":"ContainerDied","Data":"ec5c41dd46d4574ac49d8abf4849554405f2254e1fe5f827d6b2cc61c0d2e5b1"} Dec 09 15:27:01 crc kubenswrapper[4555]: I1209 15:27:01.167089 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sqk9k" event={"ID":"f3b2ab95-c092-4739-8cbc-02e9dd6de6e7","Type":"ContainerStarted","Data":"1259b86ffa0ee1df253efe8368db90772700edbe54f8184e552ccfbde5b85e27"} Dec 09 15:27:01 crc kubenswrapper[4555]: I1209 15:27:01.180665 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-sqk9k" podStartSLOduration=2.653854125 podStartE2EDuration="12.180634303s" podCreationTimestamp="2025-12-09 15:26:49 +0000 UTC" firstStartedPulling="2025-12-09 15:26:51.106785588 +0000 UTC m=+1721.243498306" lastFinishedPulling="2025-12-09 15:27:00.633565766 +0000 UTC m=+1730.770278484" observedRunningTime="2025-12-09 15:27:01.180205935 +0000 UTC m=+1731.316918653" watchObservedRunningTime="2025-12-09 15:27:01.180634303 +0000 UTC m=+1731.317347021" Dec 09 15:27:09 crc kubenswrapper[4555]: I1209 15:27:09.160719 4555 scope.go:117] "RemoveContainer" containerID="167f103c5df266f675a71b91f42df76a88386126f453e6a8bc5bab7fa2aa1818" Dec 09 15:27:09 crc 
kubenswrapper[4555]: E1209 15:27:09.161336 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6p6q_openshift-machine-config-operator(b929ab12-7d60-4e9f-a1a7-3e2d8b086d36)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" Dec 09 15:27:09 crc kubenswrapper[4555]: I1209 15:27:09.796186 4555 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-sqk9k" Dec 09 15:27:09 crc kubenswrapper[4555]: I1209 15:27:09.796224 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-sqk9k" Dec 09 15:27:09 crc kubenswrapper[4555]: I1209 15:27:09.825375 4555 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-sqk9k" Dec 09 15:27:10 crc kubenswrapper[4555]: I1209 15:27:10.240491 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-sqk9k" Dec 09 15:27:10 crc kubenswrapper[4555]: I1209 15:27:10.284224 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sqk9k"] Dec 09 15:27:10 crc kubenswrapper[4555]: I1209 15:27:10.310959 4555 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fflnp"] Dec 09 15:27:10 crc kubenswrapper[4555]: I1209 15:27:10.311150 4555 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fflnp" podUID="b25e4e41-17e1-4ca8-8761-d953a49b0f22" containerName="registry-server" containerID="cri-o://41770fbe758b546e8c91eda8682c09a3185143bfaf6c639c375c46edf538f8c6" gracePeriod=2 Dec 09 15:27:11 crc kubenswrapper[4555]: I1209 15:27:11.220433 4555 generic.go:334] "Generic (PLEG): container finished" podID="b25e4e41-17e1-4ca8-8761-d953a49b0f22" containerID="41770fbe758b546e8c91eda8682c09a3185143bfaf6c639c375c46edf538f8c6" exitCode=0 Dec 09 15:27:11 crc kubenswrapper[4555]: I1209 15:27:11.220491 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fflnp" event={"ID":"b25e4e41-17e1-4ca8-8761-d953a49b0f22","Type":"ContainerDied","Data":"41770fbe758b546e8c91eda8682c09a3185143bfaf6c639c375c46edf538f8c6"} Dec 09 15:27:12 crc kubenswrapper[4555]: I1209 15:27:12.433742 4555 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fflnp" Dec 09 15:27:12 crc kubenswrapper[4555]: I1209 15:27:12.616797 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b25e4e41-17e1-4ca8-8761-d953a49b0f22-catalog-content\") pod \"b25e4e41-17e1-4ca8-8761-d953a49b0f22\" (UID: \"b25e4e41-17e1-4ca8-8761-d953a49b0f22\") " Dec 09 15:27:12 crc kubenswrapper[4555]: I1209 15:27:12.617028 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5997s\" (UniqueName: \"kubernetes.io/projected/b25e4e41-17e1-4ca8-8761-d953a49b0f22-kube-api-access-5997s\") pod \"b25e4e41-17e1-4ca8-8761-d953a49b0f22\" (UID: \"b25e4e41-17e1-4ca8-8761-d953a49b0f22\") " Dec 09 15:27:12 crc kubenswrapper[4555]: I1209 15:27:12.617135 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b25e4e41-17e1-4ca8-8761-d953a49b0f22-utilities\") pod \"b25e4e41-17e1-4ca8-8761-d953a49b0f22\" (UID: \"b25e4e41-17e1-4ca8-8761-d953a49b0f22\") " Dec 09 15:27:12 crc kubenswrapper[4555]: I1209 15:27:12.618110 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b25e4e41-17e1-4ca8-8761-d953a49b0f22-utilities" (OuterVolumeSpecName: "utilities") pod "b25e4e41-17e1-4ca8-8761-d953a49b0f22" (UID: "b25e4e41-17e1-4ca8-8761-d953a49b0f22"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:27:12 crc kubenswrapper[4555]: I1209 15:27:12.621046 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b25e4e41-17e1-4ca8-8761-d953a49b0f22-kube-api-access-5997s" (OuterVolumeSpecName: "kube-api-access-5997s") pod "b25e4e41-17e1-4ca8-8761-d953a49b0f22" (UID: "b25e4e41-17e1-4ca8-8761-d953a49b0f22"). InnerVolumeSpecName "kube-api-access-5997s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:27:12 crc kubenswrapper[4555]: I1209 15:27:12.692758 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b25e4e41-17e1-4ca8-8761-d953a49b0f22-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b25e4e41-17e1-4ca8-8761-d953a49b0f22" (UID: "b25e4e41-17e1-4ca8-8761-d953a49b0f22"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:27:12 crc kubenswrapper[4555]: I1209 15:27:12.718767 4555 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b25e4e41-17e1-4ca8-8761-d953a49b0f22-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 15:27:12 crc kubenswrapper[4555]: I1209 15:27:12.718791 4555 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b25e4e41-17e1-4ca8-8761-d953a49b0f22-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 15:27:12 crc kubenswrapper[4555]: I1209 15:27:12.718802 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5997s\" (UniqueName: \"kubernetes.io/projected/b25e4e41-17e1-4ca8-8761-d953a49b0f22-kube-api-access-5997s\") on node \"crc\" DevicePath \"\"" Dec 09 15:27:13 crc kubenswrapper[4555]: I1209 15:27:13.231391 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fflnp" event={"ID":"b25e4e41-17e1-4ca8-8761-d953a49b0f22","Type":"ContainerDied","Data":"78d20abef45b12f7f833e482ee12c97f9bc67c17569c80e47321da2eb9775e2d"} Dec 09 15:27:13 crc kubenswrapper[4555]: I1209 15:27:13.231431 4555 scope.go:117] "RemoveContainer" containerID="41770fbe758b546e8c91eda8682c09a3185143bfaf6c639c375c46edf538f8c6" Dec 09 15:27:13 crc kubenswrapper[4555]: I1209 15:27:13.231448 4555 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fflnp" Dec 09 15:27:13 crc kubenswrapper[4555]: I1209 15:27:13.247269 4555 scope.go:117] "RemoveContainer" containerID="6c5bcbc7a2374bffc75a184e4a2488d9a4a3429d4474a1c345e94a02c6c9c863" Dec 09 15:27:13 crc kubenswrapper[4555]: I1209 15:27:13.252089 4555 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fflnp"] Dec 09 15:27:13 crc kubenswrapper[4555]: I1209 15:27:13.256016 4555 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fflnp"] Dec 09 15:27:13 crc kubenswrapper[4555]: I1209 15:27:13.271852 4555 scope.go:117] "RemoveContainer" containerID="2ecaf611ae3eea4260273d2ccb5c562bd62e426724688470792a83b962fb012e" Dec 09 15:27:14 crc kubenswrapper[4555]: I1209 15:27:14.170574 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b25e4e41-17e1-4ca8-8761-d953a49b0f22" path="/var/lib/kubelet/pods/b25e4e41-17e1-4ca8-8761-d953a49b0f22/volumes" Dec 09 15:27:22 crc kubenswrapper[4555]: I1209 15:27:22.161452 4555 scope.go:117] "RemoveContainer" containerID="167f103c5df266f675a71b91f42df76a88386126f453e6a8bc5bab7fa2aa1818" Dec 09 15:27:22 crc kubenswrapper[4555]: E1209 15:27:22.162084 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6p6q_openshift-machine-config-operator(b929ab12-7d60-4e9f-a1a7-3e2d8b086d36)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" Dec 09 15:27:37 crc kubenswrapper[4555]: I1209 15:27:37.161042 4555 scope.go:117] "RemoveContainer" containerID="167f103c5df266f675a71b91f42df76a88386126f453e6a8bc5bab7fa2aa1818" Dec 09 15:27:37 crc kubenswrapper[4555]: E1209 15:27:37.161553 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6p6q_openshift-machine-config-operator(b929ab12-7d60-4e9f-a1a7-3e2d8b086d36)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" Dec 09 15:27:52 crc kubenswrapper[4555]: I1209 15:27:52.161877 4555 scope.go:117] "RemoveContainer" containerID="167f103c5df266f675a71b91f42df76a88386126f453e6a8bc5bab7fa2aa1818" Dec 09 15:27:52 crc kubenswrapper[4555]: E1209 15:27:52.162532 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6p6q_openshift-machine-config-operator(b929ab12-7d60-4e9f-a1a7-3e2d8b086d36)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" Dec 09 15:28:07 crc kubenswrapper[4555]: I1209 15:28:07.161130 4555 scope.go:117] "RemoveContainer" containerID="167f103c5df266f675a71b91f42df76a88386126f453e6a8bc5bab7fa2aa1818" Dec 09 15:28:07 crc kubenswrapper[4555]: E1209 15:28:07.161689 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6p6q_openshift-machine-config-operator(b929ab12-7d60-4e9f-a1a7-3e2d8b086d36)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" Dec 09 15:28:18 crc kubenswrapper[4555]: I1209 15:28:18.162486 4555 scope.go:117] "RemoveContainer" containerID="167f103c5df266f675a71b91f42df76a88386126f453e6a8bc5bab7fa2aa1818" Dec 09 15:28:18 crc kubenswrapper[4555]: E1209 15:28:18.163830 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6p6q_openshift-machine-config-operator(b929ab12-7d60-4e9f-a1a7-3e2d8b086d36)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" Dec 09 15:28:33 crc kubenswrapper[4555]: I1209 15:28:33.162487 4555 scope.go:117] "RemoveContainer" containerID="167f103c5df266f675a71b91f42df76a88386126f453e6a8bc5bab7fa2aa1818" Dec 09 15:28:33 crc kubenswrapper[4555]: I1209 15:28:33.674519 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" event={"ID":"b929ab12-7d60-4e9f-a1a7-3e2d8b086d36","Type":"ContainerStarted","Data":"e4bc645e4f69b0b3b14ca33644fa6151ef72f410fa63d62a5caa76a2af6d0c75"} Dec 09 15:28:46 crc kubenswrapper[4555]: E1209 15:28:46.165515 4555 log.go:32] "PullImage from image service failed" err="rpc error: code = DeadlineExceeded desc = initializing source docker://38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05: pinging container registry 38.102.83.5:5001: Get \"http://38.102.83.5:5001/v2/\": dial tcp 38.102.83.5:5001: i/o timeout" image="38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05" Dec 09 15:28:46 crc kubenswrapper[4555]: E1209 15:28:46.166637 4555 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = DeadlineExceeded desc = initializing 
source docker://38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05: pinging container registry 38.102.83.5:5001: Get \"http://38.102.83.5:5001/v2/\": dial tcp 38.102.83.5:5001: i/o timeout" image="38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05" Dec 09 15:28:46 crc kubenswrapper[4555]: E1209 15:28:46.166791 4555 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:registry-server,Image:38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:grpc,HostPort:0,ContainerPort:50051,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-t2gln,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-operator-index-dqv6m_openstack-operators(d70d9237-a23b-4955-9f6d-1dee064186ff): ErrImagePull: rpc error: code = DeadlineExceeded desc = initializing source docker://38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05: pinging container registry 38.102.83.5:5001: Get \"http://38.102.83.5:5001/v2/\": dial tcp 38.102.83.5:5001: i/o timeout" logger="UnhandledError" Dec 09 15:28:46 crc kubenswrapper[4555]: E1209 15:28:46.168976 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ErrImagePull: \"rpc error: code = DeadlineExceeded desc = initializing source docker://38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05: pinging container registry 38.102.83.5:5001: Get \\\"http://38.102.83.5:5001/v2/\\\": dial tcp 38.102.83.5:5001: i/o timeout\"" 
pod="openstack-operators/openstack-operator-index-dqv6m" podUID="d70d9237-a23b-4955-9f6d-1dee064186ff" Dec 09 15:28:58 crc kubenswrapper[4555]: E1209 15:28:58.164272 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-dqv6m" podUID="d70d9237-a23b-4955-9f6d-1dee064186ff" Dec 09 15:29:10 crc kubenswrapper[4555]: E1209 15:29:10.165690 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-dqv6m" podUID="d70d9237-a23b-4955-9f6d-1dee064186ff" Dec 09 15:29:21 crc kubenswrapper[4555]: E1209 15:29:21.162087 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-dqv6m" podUID="d70d9237-a23b-4955-9f6d-1dee064186ff" Dec 09 15:29:35 crc kubenswrapper[4555]: E1209 15:29:35.162856 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-dqv6m" podUID="d70d9237-a23b-4955-9f6d-1dee064186ff" Dec 09 15:29:48 crc kubenswrapper[4555]: E1209 15:29:48.162390 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-dqv6m" podUID="d70d9237-a23b-4955-9f6d-1dee064186ff" Dec 09 15:29:59 crc kubenswrapper[4555]: E1209 15:29:59.162450 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-dqv6m" podUID="d70d9237-a23b-4955-9f6d-1dee064186ff" Dec 09 15:30:00 crc kubenswrapper[4555]: I1209 15:30:00.124282 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421570-gm27p"] Dec 09 15:30:00 crc kubenswrapper[4555]: E1209 15:30:00.124573 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b25e4e41-17e1-4ca8-8761-d953a49b0f22" containerName="registry-server" Dec 09 15:30:00 crc kubenswrapper[4555]: I1209 15:30:00.124584 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="b25e4e41-17e1-4ca8-8761-d953a49b0f22" containerName="registry-server" Dec 09 15:30:00 crc kubenswrapper[4555]: E1209 15:30:00.124607 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b25e4e41-17e1-4ca8-8761-d953a49b0f22" 
containerName="extract-utilities" Dec 09 15:30:00 crc kubenswrapper[4555]: I1209 15:30:00.124613 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="b25e4e41-17e1-4ca8-8761-d953a49b0f22" containerName="extract-utilities" Dec 09 15:30:00 crc kubenswrapper[4555]: E1209 15:30:00.124624 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b25e4e41-17e1-4ca8-8761-d953a49b0f22" containerName="extract-content" Dec 09 15:30:00 crc kubenswrapper[4555]: I1209 15:30:00.124630 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="b25e4e41-17e1-4ca8-8761-d953a49b0f22" containerName="extract-content" Dec 09 15:30:00 crc kubenswrapper[4555]: I1209 15:30:00.124784 4555 memory_manager.go:354] "RemoveStaleState removing state" podUID="b25e4e41-17e1-4ca8-8761-d953a49b0f22" containerName="registry-server" Dec 09 15:30:00 crc kubenswrapper[4555]: I1209 15:30:00.125233 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421570-gm27p" Dec 09 15:30:00 crc kubenswrapper[4555]: I1209 15:30:00.131070 4555 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 09 15:30:00 crc kubenswrapper[4555]: I1209 15:30:00.131559 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 09 15:30:00 crc kubenswrapper[4555]: I1209 15:30:00.133849 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421570-gm27p"] Dec 09 15:30:00 crc kubenswrapper[4555]: I1209 15:30:00.198997 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0f3eef2c-77ac-40af-aa09-f3acf4509613-secret-volume\") pod \"collect-profiles-29421570-gm27p\" (UID: \"0f3eef2c-77ac-40af-aa09-f3acf4509613\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421570-gm27p" Dec 09 15:30:00 crc kubenswrapper[4555]: I1209 15:30:00.199071 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0f3eef2c-77ac-40af-aa09-f3acf4509613-config-volume\") pod \"collect-profiles-29421570-gm27p\" (UID: \"0f3eef2c-77ac-40af-aa09-f3acf4509613\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421570-gm27p" Dec 09 15:30:00 crc kubenswrapper[4555]: I1209 15:30:00.199132 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtffv\" (UniqueName: \"kubernetes.io/projected/0f3eef2c-77ac-40af-aa09-f3acf4509613-kube-api-access-xtffv\") pod \"collect-profiles-29421570-gm27p\" (UID: \"0f3eef2c-77ac-40af-aa09-f3acf4509613\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421570-gm27p" Dec 09 15:30:00 crc kubenswrapper[4555]: I1209 15:30:00.300688 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtffv\" (UniqueName: \"kubernetes.io/projected/0f3eef2c-77ac-40af-aa09-f3acf4509613-kube-api-access-xtffv\") pod \"collect-profiles-29421570-gm27p\" (UID: \"0f3eef2c-77ac-40af-aa09-f3acf4509613\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421570-gm27p" Dec 09 15:30:00 crc kubenswrapper[4555]: I1209 15:30:00.300999 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"secret-volume\" (UniqueName: \"kubernetes.io/secret/0f3eef2c-77ac-40af-aa09-f3acf4509613-secret-volume\") pod \"collect-profiles-29421570-gm27p\" (UID: \"0f3eef2c-77ac-40af-aa09-f3acf4509613\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421570-gm27p" Dec 09 15:30:00 crc kubenswrapper[4555]: I1209 15:30:00.301156 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0f3eef2c-77ac-40af-aa09-f3acf4509613-config-volume\") pod \"collect-profiles-29421570-gm27p\" (UID: \"0f3eef2c-77ac-40af-aa09-f3acf4509613\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421570-gm27p" Dec 09 15:30:00 crc kubenswrapper[4555]: I1209 15:30:00.301878 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0f3eef2c-77ac-40af-aa09-f3acf4509613-config-volume\") pod \"collect-profiles-29421570-gm27p\" (UID: \"0f3eef2c-77ac-40af-aa09-f3acf4509613\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421570-gm27p" Dec 09 15:30:00 crc kubenswrapper[4555]: I1209 15:30:00.305556 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0f3eef2c-77ac-40af-aa09-f3acf4509613-secret-volume\") pod \"collect-profiles-29421570-gm27p\" (UID: \"0f3eef2c-77ac-40af-aa09-f3acf4509613\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421570-gm27p" Dec 09 15:30:00 crc kubenswrapper[4555]: I1209 15:30:00.314042 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtffv\" (UniqueName: \"kubernetes.io/projected/0f3eef2c-77ac-40af-aa09-f3acf4509613-kube-api-access-xtffv\") pod \"collect-profiles-29421570-gm27p\" (UID: \"0f3eef2c-77ac-40af-aa09-f3acf4509613\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421570-gm27p" Dec 09 15:30:00 crc kubenswrapper[4555]: I1209 15:30:00.446485 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421570-gm27p" Dec 09 15:30:00 crc kubenswrapper[4555]: I1209 15:30:00.792375 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421570-gm27p"] Dec 09 15:30:01 crc kubenswrapper[4555]: I1209 15:30:01.143212 4555 generic.go:334] "Generic (PLEG): container finished" podID="0f3eef2c-77ac-40af-aa09-f3acf4509613" containerID="adaeb148ed8a5a1c5f4e14f28388b591ba4fba8bd2826b67c528c4a221f1d041" exitCode=0 Dec 09 15:30:01 crc kubenswrapper[4555]: I1209 15:30:01.143478 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421570-gm27p" event={"ID":"0f3eef2c-77ac-40af-aa09-f3acf4509613","Type":"ContainerDied","Data":"adaeb148ed8a5a1c5f4e14f28388b591ba4fba8bd2826b67c528c4a221f1d041"} Dec 09 15:30:01 crc kubenswrapper[4555]: I1209 15:30:01.143921 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421570-gm27p" event={"ID":"0f3eef2c-77ac-40af-aa09-f3acf4509613","Type":"ContainerStarted","Data":"7f79c2eccf1b3ff4530fa1afd38e1671bf42776df75a68b959131c9b19f16e01"} Dec 09 15:30:02 crc kubenswrapper[4555]: I1209 15:30:02.368381 4555 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421570-gm27p" Dec 09 15:30:02 crc kubenswrapper[4555]: I1209 15:30:02.435685 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xtffv\" (UniqueName: \"kubernetes.io/projected/0f3eef2c-77ac-40af-aa09-f3acf4509613-kube-api-access-xtffv\") pod \"0f3eef2c-77ac-40af-aa09-f3acf4509613\" (UID: \"0f3eef2c-77ac-40af-aa09-f3acf4509613\") " Dec 09 15:30:02 crc kubenswrapper[4555]: I1209 15:30:02.435751 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0f3eef2c-77ac-40af-aa09-f3acf4509613-secret-volume\") pod \"0f3eef2c-77ac-40af-aa09-f3acf4509613\" (UID: \"0f3eef2c-77ac-40af-aa09-f3acf4509613\") " Dec 09 15:30:02 crc kubenswrapper[4555]: I1209 15:30:02.435771 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0f3eef2c-77ac-40af-aa09-f3acf4509613-config-volume\") pod \"0f3eef2c-77ac-40af-aa09-f3acf4509613\" (UID: \"0f3eef2c-77ac-40af-aa09-f3acf4509613\") " Dec 09 15:30:02 crc kubenswrapper[4555]: I1209 15:30:02.436467 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f3eef2c-77ac-40af-aa09-f3acf4509613-config-volume" (OuterVolumeSpecName: "config-volume") pod "0f3eef2c-77ac-40af-aa09-f3acf4509613" (UID: "0f3eef2c-77ac-40af-aa09-f3acf4509613"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 15:30:02 crc kubenswrapper[4555]: I1209 15:30:02.440383 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f3eef2c-77ac-40af-aa09-f3acf4509613-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0f3eef2c-77ac-40af-aa09-f3acf4509613" (UID: "0f3eef2c-77ac-40af-aa09-f3acf4509613"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 15:30:02 crc kubenswrapper[4555]: I1209 15:30:02.440465 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f3eef2c-77ac-40af-aa09-f3acf4509613-kube-api-access-xtffv" (OuterVolumeSpecName: "kube-api-access-xtffv") pod "0f3eef2c-77ac-40af-aa09-f3acf4509613" (UID: "0f3eef2c-77ac-40af-aa09-f3acf4509613"). InnerVolumeSpecName "kube-api-access-xtffv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:30:02 crc kubenswrapper[4555]: I1209 15:30:02.536910 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xtffv\" (UniqueName: \"kubernetes.io/projected/0f3eef2c-77ac-40af-aa09-f3acf4509613-kube-api-access-xtffv\") on node \"crc\" DevicePath \"\"" Dec 09 15:30:02 crc kubenswrapper[4555]: I1209 15:30:02.536933 4555 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0f3eef2c-77ac-40af-aa09-f3acf4509613-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 09 15:30:02 crc kubenswrapper[4555]: I1209 15:30:02.536941 4555 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0f3eef2c-77ac-40af-aa09-f3acf4509613-config-volume\") on node \"crc\" DevicePath \"\"" Dec 09 15:30:03 crc kubenswrapper[4555]: I1209 15:30:03.157221 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421570-gm27p" event={"ID":"0f3eef2c-77ac-40af-aa09-f3acf4509613","Type":"ContainerDied","Data":"7f79c2eccf1b3ff4530fa1afd38e1671bf42776df75a68b959131c9b19f16e01"} Dec 09 15:30:03 crc kubenswrapper[4555]: I1209 15:30:03.157261 4555 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f79c2eccf1b3ff4530fa1afd38e1671bf42776df75a68b959131c9b19f16e01" Dec 09 15:30:03 crc kubenswrapper[4555]: I1209 15:30:03.157282 4555 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421570-gm27p" Dec 09 15:30:12 crc kubenswrapper[4555]: E1209 15:30:12.162165 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-dqv6m" podUID="d70d9237-a23b-4955-9f6d-1dee064186ff" Dec 09 15:30:26 crc kubenswrapper[4555]: E1209 15:30:26.162137 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-dqv6m" podUID="d70d9237-a23b-4955-9f6d-1dee064186ff" Dec 09 15:30:38 crc kubenswrapper[4555]: E1209 15:30:38.162873 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-dqv6m" podUID="d70d9237-a23b-4955-9f6d-1dee064186ff" Dec 09 15:30:51 crc kubenswrapper[4555]: E1209 15:30:51.162275 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-dqv6m" podUID="d70d9237-a23b-4955-9f6d-1dee064186ff" Dec 09 15:30:55 crc kubenswrapper[4555]: I1209 15:30:55.531564 4555 patch_prober.go:28] interesting 
pod/machine-config-daemon-x6p6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 15:30:55 crc kubenswrapper[4555]: I1209 15:30:55.531885 4555 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 15:31:03 crc kubenswrapper[4555]: E1209 15:31:03.163237 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-dqv6m" podUID="d70d9237-a23b-4955-9f6d-1dee064186ff" Dec 09 15:31:18 crc kubenswrapper[4555]: E1209 15:31:18.162364 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-dqv6m" podUID="d70d9237-a23b-4955-9f6d-1dee064186ff" Dec 09 15:31:18 crc kubenswrapper[4555]: I1209 15:31:18.264671 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8b7f6/must-gather-sp9dk"] Dec 09 15:31:18 crc kubenswrapper[4555]: E1209 15:31:18.264961 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f3eef2c-77ac-40af-aa09-f3acf4509613" containerName="collect-profiles" Dec 09 15:31:18 crc kubenswrapper[4555]: I1209 15:31:18.264977 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f3eef2c-77ac-40af-aa09-f3acf4509613" containerName="collect-profiles" Dec 09 15:31:18 crc kubenswrapper[4555]: I1209 15:31:18.265113 4555 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f3eef2c-77ac-40af-aa09-f3acf4509613" containerName="collect-profiles" Dec 09 15:31:18 crc kubenswrapper[4555]: I1209 15:31:18.265869 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8b7f6/must-gather-sp9dk" Dec 09 15:31:18 crc kubenswrapper[4555]: I1209 15:31:18.267650 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-8b7f6"/"kube-root-ca.crt" Dec 09 15:31:18 crc kubenswrapper[4555]: I1209 15:31:18.268535 4555 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-8b7f6"/"openshift-service-ca.crt" Dec 09 15:31:18 crc kubenswrapper[4555]: I1209 15:31:18.284897 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8b7f6/must-gather-sp9dk"] Dec 09 15:31:18 crc kubenswrapper[4555]: I1209 15:31:18.333183 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t958x\" (UniqueName: \"kubernetes.io/projected/b5f81540-f9fa-4ce5-9f64-2569567be218-kube-api-access-t958x\") pod \"must-gather-sp9dk\" (UID: \"b5f81540-f9fa-4ce5-9f64-2569567be218\") " pod="openshift-must-gather-8b7f6/must-gather-sp9dk" Dec 09 15:31:18 crc kubenswrapper[4555]: I1209 15:31:18.333215 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b5f81540-f9fa-4ce5-9f64-2569567be218-must-gather-output\") pod \"must-gather-sp9dk\" (UID: \"b5f81540-f9fa-4ce5-9f64-2569567be218\") " pod="openshift-must-gather-8b7f6/must-gather-sp9dk" Dec 09 15:31:18 crc kubenswrapper[4555]: I1209 15:31:18.434458 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t958x\" (UniqueName: \"kubernetes.io/projected/b5f81540-f9fa-4ce5-9f64-2569567be218-kube-api-access-t958x\") pod \"must-gather-sp9dk\" (UID: \"b5f81540-f9fa-4ce5-9f64-2569567be218\") " pod="openshift-must-gather-8b7f6/must-gather-sp9dk" Dec 09 15:31:18 crc kubenswrapper[4555]: I1209 15:31:18.434501 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b5f81540-f9fa-4ce5-9f64-2569567be218-must-gather-output\") pod \"must-gather-sp9dk\" (UID: \"b5f81540-f9fa-4ce5-9f64-2569567be218\") " pod="openshift-must-gather-8b7f6/must-gather-sp9dk" Dec 09 15:31:18 crc kubenswrapper[4555]: I1209 15:31:18.434882 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b5f81540-f9fa-4ce5-9f64-2569567be218-must-gather-output\") pod \"must-gather-sp9dk\" (UID: \"b5f81540-f9fa-4ce5-9f64-2569567be218\") " pod="openshift-must-gather-8b7f6/must-gather-sp9dk" Dec 09 15:31:18 crc kubenswrapper[4555]: I1209 15:31:18.458698 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t958x\" (UniqueName: \"kubernetes.io/projected/b5f81540-f9fa-4ce5-9f64-2569567be218-kube-api-access-t958x\") pod \"must-gather-sp9dk\" (UID: \"b5f81540-f9fa-4ce5-9f64-2569567be218\") " pod="openshift-must-gather-8b7f6/must-gather-sp9dk" Dec 09 15:31:18 crc kubenswrapper[4555]: I1209 15:31:18.579695 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8b7f6/must-gather-sp9dk" Dec 09 15:31:18 crc kubenswrapper[4555]: I1209 15:31:18.930840 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8b7f6/must-gather-sp9dk"] Dec 09 15:31:19 crc kubenswrapper[4555]: I1209 15:31:19.563527 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8b7f6/must-gather-sp9dk" event={"ID":"b5f81540-f9fa-4ce5-9f64-2569567be218","Type":"ContainerStarted","Data":"c157636b0fcb98662f604e5f90a16e258b73d901bd42d4cc177e5a7d9ba953f8"} Dec 09 15:31:24 crc kubenswrapper[4555]: I1209 15:31:24.599395 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8b7f6/must-gather-sp9dk" event={"ID":"b5f81540-f9fa-4ce5-9f64-2569567be218","Type":"ContainerStarted","Data":"915c2d43e943b70c7daed29490b0c0e013059eb5c006f36b492b68ed32c21a5f"} Dec 09 15:31:25 crc kubenswrapper[4555]: I1209 15:31:25.531789 4555 patch_prober.go:28] interesting pod/machine-config-daemon-x6p6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 15:31:25 crc kubenswrapper[4555]: I1209 15:31:25.532006 4555 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 15:31:25 crc kubenswrapper[4555]: I1209 15:31:25.606408 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8b7f6/must-gather-sp9dk" event={"ID":"b5f81540-f9fa-4ce5-9f64-2569567be218","Type":"ContainerStarted","Data":"ff6e08d043a6c1546e83dde469b6fa814c44b37be05b7b4b1db29f0e82679da2"} Dec 09 15:31:25 crc kubenswrapper[4555]: I1209 15:31:25.618803 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-8b7f6/must-gather-sp9dk" podStartSLOduration=2.101759902 podStartE2EDuration="7.618783615s" podCreationTimestamp="2025-12-09 15:31:18 +0000 UTC" firstStartedPulling="2025-12-09 15:31:18.935386117 +0000 UTC m=+1989.072098836" lastFinishedPulling="2025-12-09 15:31:24.452409832 +0000 UTC m=+1994.589122549" observedRunningTime="2025-12-09 15:31:25.617143653 +0000 UTC m=+1995.753856371" watchObservedRunningTime="2025-12-09 15:31:25.618783615 +0000 UTC m=+1995.755496333" Dec 09 15:31:30 crc kubenswrapper[4555]: I1209 15:31:30.489456 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rx67c"] Dec 09 15:31:30 crc kubenswrapper[4555]: I1209 15:31:30.491083 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rx67c" Dec 09 15:31:30 crc kubenswrapper[4555]: I1209 15:31:30.498710 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rx67c"] Dec 09 15:31:30 crc kubenswrapper[4555]: I1209 15:31:30.514067 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7606983e-d511-40d8-ac15-d41f98194c91-catalog-content\") pod \"community-operators-rx67c\" (UID: \"7606983e-d511-40d8-ac15-d41f98194c91\") " pod="openshift-marketplace/community-operators-rx67c" Dec 09 15:31:30 crc kubenswrapper[4555]: I1209 15:31:30.514134 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2vrv\" (UniqueName: \"kubernetes.io/projected/7606983e-d511-40d8-ac15-d41f98194c91-kube-api-access-z2vrv\") pod \"community-operators-rx67c\" (UID: \"7606983e-d511-40d8-ac15-d41f98194c91\") " pod="openshift-marketplace/community-operators-rx67c" Dec 09 15:31:30 crc kubenswrapper[4555]: I1209 15:31:30.514318 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7606983e-d511-40d8-ac15-d41f98194c91-utilities\") pod \"community-operators-rx67c\" (UID: \"7606983e-d511-40d8-ac15-d41f98194c91\") " pod="openshift-marketplace/community-operators-rx67c" Dec 09 15:31:30 crc kubenswrapper[4555]: I1209 15:31:30.615527 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7606983e-d511-40d8-ac15-d41f98194c91-utilities\") pod \"community-operators-rx67c\" (UID: \"7606983e-d511-40d8-ac15-d41f98194c91\") " pod="openshift-marketplace/community-operators-rx67c" Dec 09 15:31:30 crc kubenswrapper[4555]: I1209 15:31:30.615634 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7606983e-d511-40d8-ac15-d41f98194c91-catalog-content\") pod \"community-operators-rx67c\" (UID: \"7606983e-d511-40d8-ac15-d41f98194c91\") " pod="openshift-marketplace/community-operators-rx67c" Dec 09 15:31:30 crc kubenswrapper[4555]: I1209 15:31:30.615682 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2vrv\" (UniqueName: \"kubernetes.io/projected/7606983e-d511-40d8-ac15-d41f98194c91-kube-api-access-z2vrv\") pod \"community-operators-rx67c\" (UID: \"7606983e-d511-40d8-ac15-d41f98194c91\") " pod="openshift-marketplace/community-operators-rx67c" Dec 09 15:31:30 crc kubenswrapper[4555]: I1209 15:31:30.615937 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7606983e-d511-40d8-ac15-d41f98194c91-utilities\") pod \"community-operators-rx67c\" (UID: \"7606983e-d511-40d8-ac15-d41f98194c91\") " pod="openshift-marketplace/community-operators-rx67c" Dec 09 15:31:30 crc kubenswrapper[4555]: I1209 15:31:30.615999 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7606983e-d511-40d8-ac15-d41f98194c91-catalog-content\") pod \"community-operators-rx67c\" (UID: \"7606983e-d511-40d8-ac15-d41f98194c91\") " pod="openshift-marketplace/community-operators-rx67c" Dec 09 15:31:30 crc kubenswrapper[4555]: I1209 15:31:30.635149 4555 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-z2vrv\" (UniqueName: \"kubernetes.io/projected/7606983e-d511-40d8-ac15-d41f98194c91-kube-api-access-z2vrv\") pod \"community-operators-rx67c\" (UID: \"7606983e-d511-40d8-ac15-d41f98194c91\") " pod="openshift-marketplace/community-operators-rx67c" Dec 09 15:31:30 crc kubenswrapper[4555]: I1209 15:31:30.686010 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sbgs2"] Dec 09 15:31:30 crc kubenswrapper[4555]: I1209 15:31:30.687337 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sbgs2" Dec 09 15:31:30 crc kubenswrapper[4555]: I1209 15:31:30.697100 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sbgs2"] Dec 09 15:31:30 crc kubenswrapper[4555]: I1209 15:31:30.717507 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dj7hv\" (UniqueName: \"kubernetes.io/projected/65ef008b-d6cf-4632-9e29-c4c60f31cae6-kube-api-access-dj7hv\") pod \"redhat-marketplace-sbgs2\" (UID: \"65ef008b-d6cf-4632-9e29-c4c60f31cae6\") " pod="openshift-marketplace/redhat-marketplace-sbgs2" Dec 09 15:31:30 crc kubenswrapper[4555]: I1209 15:31:30.717598 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65ef008b-d6cf-4632-9e29-c4c60f31cae6-utilities\") pod \"redhat-marketplace-sbgs2\" (UID: \"65ef008b-d6cf-4632-9e29-c4c60f31cae6\") " pod="openshift-marketplace/redhat-marketplace-sbgs2" Dec 09 15:31:30 crc kubenswrapper[4555]: I1209 15:31:30.717635 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65ef008b-d6cf-4632-9e29-c4c60f31cae6-catalog-content\") pod \"redhat-marketplace-sbgs2\" (UID: \"65ef008b-d6cf-4632-9e29-c4c60f31cae6\") " pod="openshift-marketplace/redhat-marketplace-sbgs2" Dec 09 15:31:30 crc kubenswrapper[4555]: I1209 15:31:30.804717 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rx67c" Dec 09 15:31:30 crc kubenswrapper[4555]: I1209 15:31:30.818931 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dj7hv\" (UniqueName: \"kubernetes.io/projected/65ef008b-d6cf-4632-9e29-c4c60f31cae6-kube-api-access-dj7hv\") pod \"redhat-marketplace-sbgs2\" (UID: \"65ef008b-d6cf-4632-9e29-c4c60f31cae6\") " pod="openshift-marketplace/redhat-marketplace-sbgs2" Dec 09 15:31:30 crc kubenswrapper[4555]: I1209 15:31:30.818995 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65ef008b-d6cf-4632-9e29-c4c60f31cae6-utilities\") pod \"redhat-marketplace-sbgs2\" (UID: \"65ef008b-d6cf-4632-9e29-c4c60f31cae6\") " pod="openshift-marketplace/redhat-marketplace-sbgs2" Dec 09 15:31:30 crc kubenswrapper[4555]: I1209 15:31:30.819020 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65ef008b-d6cf-4632-9e29-c4c60f31cae6-catalog-content\") pod \"redhat-marketplace-sbgs2\" (UID: \"65ef008b-d6cf-4632-9e29-c4c60f31cae6\") " pod="openshift-marketplace/redhat-marketplace-sbgs2" Dec 09 15:31:30 crc kubenswrapper[4555]: I1209 15:31:30.819446 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65ef008b-d6cf-4632-9e29-c4c60f31cae6-catalog-content\") pod \"redhat-marketplace-sbgs2\" (UID: \"65ef008b-d6cf-4632-9e29-c4c60f31cae6\") " pod="openshift-marketplace/redhat-marketplace-sbgs2" Dec 09 15:31:30 crc kubenswrapper[4555]: I1209 15:31:30.819884 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65ef008b-d6cf-4632-9e29-c4c60f31cae6-utilities\") pod \"redhat-marketplace-sbgs2\" (UID: \"65ef008b-d6cf-4632-9e29-c4c60f31cae6\") " pod="openshift-marketplace/redhat-marketplace-sbgs2" Dec 09 15:31:30 crc kubenswrapper[4555]: I1209 15:31:30.832473 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dj7hv\" (UniqueName: \"kubernetes.io/projected/65ef008b-d6cf-4632-9e29-c4c60f31cae6-kube-api-access-dj7hv\") pod \"redhat-marketplace-sbgs2\" (UID: \"65ef008b-d6cf-4632-9e29-c4c60f31cae6\") " pod="openshift-marketplace/redhat-marketplace-sbgs2" Dec 09 15:31:31 crc kubenswrapper[4555]: I1209 15:31:31.008684 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sbgs2" Dec 09 15:31:31 crc kubenswrapper[4555]: I1209 15:31:31.189949 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rx67c"] Dec 09 15:31:31 crc kubenswrapper[4555]: W1209 15:31:31.195395 4555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7606983e_d511_40d8_ac15_d41f98194c91.slice/crio-8ca42e7419d509efdc96dede185ad773ca3aee64651598a71c3cc95f52bd6634 WatchSource:0}: Error finding container 8ca42e7419d509efdc96dede185ad773ca3aee64651598a71c3cc95f52bd6634: Status 404 returned error can't find the container with id 8ca42e7419d509efdc96dede185ad773ca3aee64651598a71c3cc95f52bd6634 Dec 09 15:31:31 crc kubenswrapper[4555]: I1209 15:31:31.379253 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sbgs2"] Dec 09 15:31:31 crc kubenswrapper[4555]: W1209 15:31:31.392583 4555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65ef008b_d6cf_4632_9e29_c4c60f31cae6.slice/crio-0835190c6561474aae332457b80a8a5397a22eda010385366ed6bdb4ccf6c239 WatchSource:0}: Error finding container 0835190c6561474aae332457b80a8a5397a22eda010385366ed6bdb4ccf6c239: Status 404 returned error can't find the container with id 0835190c6561474aae332457b80a8a5397a22eda010385366ed6bdb4ccf6c239 Dec 09 15:31:31 crc kubenswrapper[4555]: I1209 15:31:31.645010 4555 generic.go:334] "Generic (PLEG): container finished" podID="7606983e-d511-40d8-ac15-d41f98194c91" containerID="7e98c8f9b999c6931a7bb89a8561fe199658bd1e8b805e4a5f0b293c25c6b4e6" exitCode=0 Dec 09 15:31:31 crc kubenswrapper[4555]: I1209 15:31:31.645050 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rx67c" event={"ID":"7606983e-d511-40d8-ac15-d41f98194c91","Type":"ContainerDied","Data":"7e98c8f9b999c6931a7bb89a8561fe199658bd1e8b805e4a5f0b293c25c6b4e6"} Dec 09 15:31:31 crc kubenswrapper[4555]: I1209 15:31:31.645238 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rx67c" event={"ID":"7606983e-d511-40d8-ac15-d41f98194c91","Type":"ContainerStarted","Data":"8ca42e7419d509efdc96dede185ad773ca3aee64651598a71c3cc95f52bd6634"} Dec 09 15:31:31 crc kubenswrapper[4555]: I1209 15:31:31.646892 4555 generic.go:334] "Generic (PLEG): container finished" podID="65ef008b-d6cf-4632-9e29-c4c60f31cae6" containerID="f8aba5d46ef5b1be7c3db6430ed4304c619e2bc945d1633bef8be26fe232bc25" exitCode=0 Dec 09 15:31:31 crc kubenswrapper[4555]: I1209 15:31:31.646916 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbgs2" event={"ID":"65ef008b-d6cf-4632-9e29-c4c60f31cae6","Type":"ContainerDied","Data":"f8aba5d46ef5b1be7c3db6430ed4304c619e2bc945d1633bef8be26fe232bc25"} Dec 09 15:31:31 crc kubenswrapper[4555]: I1209 15:31:31.646942 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbgs2" event={"ID":"65ef008b-d6cf-4632-9e29-c4c60f31cae6","Type":"ContainerStarted","Data":"0835190c6561474aae332457b80a8a5397a22eda010385366ed6bdb4ccf6c239"} Dec 09 15:31:32 crc kubenswrapper[4555]: I1209 15:31:32.653936 4555 generic.go:334] "Generic (PLEG): container finished" podID="7606983e-d511-40d8-ac15-d41f98194c91" containerID="7b90d86369df271d29230d6fbd53662afdb879f389fd2788a1c0d88409511bcd" 
exitCode=0 Dec 09 15:31:32 crc kubenswrapper[4555]: I1209 15:31:32.654004 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rx67c" event={"ID":"7606983e-d511-40d8-ac15-d41f98194c91","Type":"ContainerDied","Data":"7b90d86369df271d29230d6fbd53662afdb879f389fd2788a1c0d88409511bcd"} Dec 09 15:31:32 crc kubenswrapper[4555]: I1209 15:31:32.658190 4555 generic.go:334] "Generic (PLEG): container finished" podID="65ef008b-d6cf-4632-9e29-c4c60f31cae6" containerID="d2c616c76f6efbf7115a55fdfdd4d15f230dbb2e42bcc3854303fa95b61a648c" exitCode=0 Dec 09 15:31:32 crc kubenswrapper[4555]: I1209 15:31:32.658229 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbgs2" event={"ID":"65ef008b-d6cf-4632-9e29-c4c60f31cae6","Type":"ContainerDied","Data":"d2c616c76f6efbf7115a55fdfdd4d15f230dbb2e42bcc3854303fa95b61a648c"} Dec 09 15:31:33 crc kubenswrapper[4555]: I1209 15:31:33.666920 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbgs2" event={"ID":"65ef008b-d6cf-4632-9e29-c4c60f31cae6","Type":"ContainerStarted","Data":"34bbb6b6b10fc40ceba17ce35f53b65eceba1fa4877be15a59bdeb3a436403c0"} Dec 09 15:31:33 crc kubenswrapper[4555]: I1209 15:31:33.668648 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rx67c" event={"ID":"7606983e-d511-40d8-ac15-d41f98194c91","Type":"ContainerStarted","Data":"2a41c0da96394b6fc07f4da1084111d5e0cd0dcd47b6c15ee286e23dbd29375f"} Dec 09 15:31:33 crc kubenswrapper[4555]: I1209 15:31:33.681140 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sbgs2" podStartSLOduration=2.19541649 podStartE2EDuration="3.681126169s" podCreationTimestamp="2025-12-09 15:31:30 +0000 UTC" firstStartedPulling="2025-12-09 15:31:31.648137169 +0000 UTC m=+2001.784849887" lastFinishedPulling="2025-12-09 15:31:33.133846849 +0000 UTC m=+2003.270559566" observedRunningTime="2025-12-09 15:31:33.680974112 +0000 UTC m=+2003.817686819" watchObservedRunningTime="2025-12-09 15:31:33.681126169 +0000 UTC m=+2003.817838887" Dec 09 15:31:33 crc kubenswrapper[4555]: I1209 15:31:33.699660 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rx67c" podStartSLOduration=2.164762987 podStartE2EDuration="3.699641726s" podCreationTimestamp="2025-12-09 15:31:30 +0000 UTC" firstStartedPulling="2025-12-09 15:31:31.646005771 +0000 UTC m=+2001.782718488" lastFinishedPulling="2025-12-09 15:31:33.180884509 +0000 UTC m=+2003.317597227" observedRunningTime="2025-12-09 15:31:33.697957911 +0000 UTC m=+2003.834670630" watchObservedRunningTime="2025-12-09 15:31:33.699641726 +0000 UTC m=+2003.836354444" Dec 09 15:31:40 crc kubenswrapper[4555]: I1209 15:31:40.805852 4555 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rx67c" Dec 09 15:31:40 crc kubenswrapper[4555]: I1209 15:31:40.806392 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rx67c" Dec 09 15:31:40 crc kubenswrapper[4555]: I1209 15:31:40.835983 4555 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rx67c" Dec 09 15:31:41 crc kubenswrapper[4555]: I1209 15:31:41.009683 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-marketplace-sbgs2" Dec 09 15:31:41 crc kubenswrapper[4555]: I1209 15:31:41.009898 4555 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sbgs2" Dec 09 15:31:41 crc kubenswrapper[4555]: I1209 15:31:41.044373 4555 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sbgs2" Dec 09 15:31:41 crc kubenswrapper[4555]: I1209 15:31:41.748358 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rx67c" Dec 09 15:31:41 crc kubenswrapper[4555]: I1209 15:31:41.748621 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sbgs2" Dec 09 15:31:42 crc kubenswrapper[4555]: I1209 15:31:42.679566 4555 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rx67c"] Dec 09 15:31:42 crc kubenswrapper[4555]: I1209 15:31:42.879167 4555 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sbgs2"] Dec 09 15:31:43 crc kubenswrapper[4555]: I1209 15:31:43.728547 4555 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rx67c" podUID="7606983e-d511-40d8-ac15-d41f98194c91" containerName="registry-server" containerID="cri-o://2a41c0da96394b6fc07f4da1084111d5e0cd0dcd47b6c15ee286e23dbd29375f" gracePeriod=2 Dec 09 15:31:44 crc kubenswrapper[4555]: I1209 15:31:44.071202 4555 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rx67c" Dec 09 15:31:44 crc kubenswrapper[4555]: I1209 15:31:44.214486 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7606983e-d511-40d8-ac15-d41f98194c91-catalog-content\") pod \"7606983e-d511-40d8-ac15-d41f98194c91\" (UID: \"7606983e-d511-40d8-ac15-d41f98194c91\") " Dec 09 15:31:44 crc kubenswrapper[4555]: I1209 15:31:44.214880 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7606983e-d511-40d8-ac15-d41f98194c91-utilities\") pod \"7606983e-d511-40d8-ac15-d41f98194c91\" (UID: \"7606983e-d511-40d8-ac15-d41f98194c91\") " Dec 09 15:31:44 crc kubenswrapper[4555]: I1209 15:31:44.214946 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2vrv\" (UniqueName: \"kubernetes.io/projected/7606983e-d511-40d8-ac15-d41f98194c91-kube-api-access-z2vrv\") pod \"7606983e-d511-40d8-ac15-d41f98194c91\" (UID: \"7606983e-d511-40d8-ac15-d41f98194c91\") " Dec 09 15:31:44 crc kubenswrapper[4555]: I1209 15:31:44.215502 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7606983e-d511-40d8-ac15-d41f98194c91-utilities" (OuterVolumeSpecName: "utilities") pod "7606983e-d511-40d8-ac15-d41f98194c91" (UID: "7606983e-d511-40d8-ac15-d41f98194c91"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:31:44 crc kubenswrapper[4555]: I1209 15:31:44.223456 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7606983e-d511-40d8-ac15-d41f98194c91-kube-api-access-z2vrv" (OuterVolumeSpecName: "kube-api-access-z2vrv") pod "7606983e-d511-40d8-ac15-d41f98194c91" (UID: "7606983e-d511-40d8-ac15-d41f98194c91"). InnerVolumeSpecName "kube-api-access-z2vrv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:31:44 crc kubenswrapper[4555]: I1209 15:31:44.248399 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7606983e-d511-40d8-ac15-d41f98194c91-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7606983e-d511-40d8-ac15-d41f98194c91" (UID: "7606983e-d511-40d8-ac15-d41f98194c91"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:31:44 crc kubenswrapper[4555]: I1209 15:31:44.316662 4555 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7606983e-d511-40d8-ac15-d41f98194c91-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 15:31:44 crc kubenswrapper[4555]: I1209 15:31:44.316693 4555 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7606983e-d511-40d8-ac15-d41f98194c91-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 15:31:44 crc kubenswrapper[4555]: I1209 15:31:44.316705 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z2vrv\" (UniqueName: \"kubernetes.io/projected/7606983e-d511-40d8-ac15-d41f98194c91-kube-api-access-z2vrv\") on node \"crc\" DevicePath \"\"" Dec 09 15:31:44 crc kubenswrapper[4555]: I1209 15:31:44.735759 4555 generic.go:334] "Generic (PLEG): container finished" podID="7606983e-d511-40d8-ac15-d41f98194c91" containerID="2a41c0da96394b6fc07f4da1084111d5e0cd0dcd47b6c15ee286e23dbd29375f" exitCode=0 Dec 09 15:31:44 crc kubenswrapper[4555]: I1209 15:31:44.735799 4555 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rx67c" Dec 09 15:31:44 crc kubenswrapper[4555]: I1209 15:31:44.735839 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rx67c" event={"ID":"7606983e-d511-40d8-ac15-d41f98194c91","Type":"ContainerDied","Data":"2a41c0da96394b6fc07f4da1084111d5e0cd0dcd47b6c15ee286e23dbd29375f"} Dec 09 15:31:44 crc kubenswrapper[4555]: I1209 15:31:44.735881 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rx67c" event={"ID":"7606983e-d511-40d8-ac15-d41f98194c91","Type":"ContainerDied","Data":"8ca42e7419d509efdc96dede185ad773ca3aee64651598a71c3cc95f52bd6634"} Dec 09 15:31:44 crc kubenswrapper[4555]: I1209 15:31:44.735898 4555 scope.go:117] "RemoveContainer" containerID="2a41c0da96394b6fc07f4da1084111d5e0cd0dcd47b6c15ee286e23dbd29375f" Dec 09 15:31:44 crc kubenswrapper[4555]: I1209 15:31:44.736152 4555 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-sbgs2" podUID="65ef008b-d6cf-4632-9e29-c4c60f31cae6" containerName="registry-server" containerID="cri-o://34bbb6b6b10fc40ceba17ce35f53b65eceba1fa4877be15a59bdeb3a436403c0" gracePeriod=2 Dec 09 15:31:44 crc kubenswrapper[4555]: I1209 15:31:44.752987 4555 scope.go:117] "RemoveContainer" containerID="7b90d86369df271d29230d6fbd53662afdb879f389fd2788a1c0d88409511bcd" Dec 09 15:31:44 crc kubenswrapper[4555]: I1209 15:31:44.756804 4555 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rx67c"] Dec 09 15:31:44 crc kubenswrapper[4555]: I1209 15:31:44.761909 4555 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rx67c"] Dec 09 15:31:44 crc kubenswrapper[4555]: I1209 15:31:44.823005 4555 scope.go:117] "RemoveContainer" containerID="7e98c8f9b999c6931a7bb89a8561fe199658bd1e8b805e4a5f0b293c25c6b4e6" Dec 09 15:31:44 crc kubenswrapper[4555]: I1209 15:31:44.842922 4555 scope.go:117] "RemoveContainer" containerID="2a41c0da96394b6fc07f4da1084111d5e0cd0dcd47b6c15ee286e23dbd29375f" Dec 09 15:31:44 crc kubenswrapper[4555]: E1209 15:31:44.844565 4555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a41c0da96394b6fc07f4da1084111d5e0cd0dcd47b6c15ee286e23dbd29375f\": container with ID starting with 2a41c0da96394b6fc07f4da1084111d5e0cd0dcd47b6c15ee286e23dbd29375f not found: ID does not exist" containerID="2a41c0da96394b6fc07f4da1084111d5e0cd0dcd47b6c15ee286e23dbd29375f" Dec 09 15:31:44 crc kubenswrapper[4555]: I1209 15:31:44.844622 4555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a41c0da96394b6fc07f4da1084111d5e0cd0dcd47b6c15ee286e23dbd29375f"} err="failed to get container status \"2a41c0da96394b6fc07f4da1084111d5e0cd0dcd47b6c15ee286e23dbd29375f\": rpc error: code = NotFound desc = could not find container \"2a41c0da96394b6fc07f4da1084111d5e0cd0dcd47b6c15ee286e23dbd29375f\": container with ID starting with 2a41c0da96394b6fc07f4da1084111d5e0cd0dcd47b6c15ee286e23dbd29375f not found: ID does not exist" Dec 09 15:31:44 crc kubenswrapper[4555]: I1209 15:31:44.844646 4555 scope.go:117] "RemoveContainer" containerID="7b90d86369df271d29230d6fbd53662afdb879f389fd2788a1c0d88409511bcd" Dec 09 15:31:44 crc kubenswrapper[4555]: E1209 15:31:44.844920 4555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"7b90d86369df271d29230d6fbd53662afdb879f389fd2788a1c0d88409511bcd\": container with ID starting with 7b90d86369df271d29230d6fbd53662afdb879f389fd2788a1c0d88409511bcd not found: ID does not exist" containerID="7b90d86369df271d29230d6fbd53662afdb879f389fd2788a1c0d88409511bcd" Dec 09 15:31:44 crc kubenswrapper[4555]: I1209 15:31:44.844964 4555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b90d86369df271d29230d6fbd53662afdb879f389fd2788a1c0d88409511bcd"} err="failed to get container status \"7b90d86369df271d29230d6fbd53662afdb879f389fd2788a1c0d88409511bcd\": rpc error: code = NotFound desc = could not find container \"7b90d86369df271d29230d6fbd53662afdb879f389fd2788a1c0d88409511bcd\": container with ID starting with 7b90d86369df271d29230d6fbd53662afdb879f389fd2788a1c0d88409511bcd not found: ID does not exist" Dec 09 15:31:44 crc kubenswrapper[4555]: I1209 15:31:44.844982 4555 scope.go:117] "RemoveContainer" containerID="7e98c8f9b999c6931a7bb89a8561fe199658bd1e8b805e4a5f0b293c25c6b4e6" Dec 09 15:31:44 crc kubenswrapper[4555]: E1209 15:31:44.845369 4555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e98c8f9b999c6931a7bb89a8561fe199658bd1e8b805e4a5f0b293c25c6b4e6\": container with ID starting with 7e98c8f9b999c6931a7bb89a8561fe199658bd1e8b805e4a5f0b293c25c6b4e6 not found: ID does not exist" containerID="7e98c8f9b999c6931a7bb89a8561fe199658bd1e8b805e4a5f0b293c25c6b4e6" Dec 09 15:31:44 crc kubenswrapper[4555]: I1209 15:31:44.845407 4555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e98c8f9b999c6931a7bb89a8561fe199658bd1e8b805e4a5f0b293c25c6b4e6"} err="failed to get container status \"7e98c8f9b999c6931a7bb89a8561fe199658bd1e8b805e4a5f0b293c25c6b4e6\": rpc error: code = NotFound desc = could not find container \"7e98c8f9b999c6931a7bb89a8561fe199658bd1e8b805e4a5f0b293c25c6b4e6\": container with ID starting with 7e98c8f9b999c6931a7bb89a8561fe199658bd1e8b805e4a5f0b293c25c6b4e6 not found: ID does not exist" Dec 09 15:31:45 crc kubenswrapper[4555]: I1209 15:31:45.081120 4555 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sbgs2" Dec 09 15:31:45 crc kubenswrapper[4555]: I1209 15:31:45.125050 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65ef008b-d6cf-4632-9e29-c4c60f31cae6-catalog-content\") pod \"65ef008b-d6cf-4632-9e29-c4c60f31cae6\" (UID: \"65ef008b-d6cf-4632-9e29-c4c60f31cae6\") " Dec 09 15:31:45 crc kubenswrapper[4555]: I1209 15:31:45.125107 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65ef008b-d6cf-4632-9e29-c4c60f31cae6-utilities\") pod \"65ef008b-d6cf-4632-9e29-c4c60f31cae6\" (UID: \"65ef008b-d6cf-4632-9e29-c4c60f31cae6\") " Dec 09 15:31:45 crc kubenswrapper[4555]: I1209 15:31:45.125157 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dj7hv\" (UniqueName: \"kubernetes.io/projected/65ef008b-d6cf-4632-9e29-c4c60f31cae6-kube-api-access-dj7hv\") pod \"65ef008b-d6cf-4632-9e29-c4c60f31cae6\" (UID: \"65ef008b-d6cf-4632-9e29-c4c60f31cae6\") " Dec 09 15:31:45 crc kubenswrapper[4555]: I1209 15:31:45.125729 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65ef008b-d6cf-4632-9e29-c4c60f31cae6-utilities" (OuterVolumeSpecName: "utilities") pod "65ef008b-d6cf-4632-9e29-c4c60f31cae6" (UID: "65ef008b-d6cf-4632-9e29-c4c60f31cae6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:31:45 crc kubenswrapper[4555]: I1209 15:31:45.130153 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65ef008b-d6cf-4632-9e29-c4c60f31cae6-kube-api-access-dj7hv" (OuterVolumeSpecName: "kube-api-access-dj7hv") pod "65ef008b-d6cf-4632-9e29-c4c60f31cae6" (UID: "65ef008b-d6cf-4632-9e29-c4c60f31cae6"). InnerVolumeSpecName "kube-api-access-dj7hv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:31:45 crc kubenswrapper[4555]: I1209 15:31:45.139168 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65ef008b-d6cf-4632-9e29-c4c60f31cae6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "65ef008b-d6cf-4632-9e29-c4c60f31cae6" (UID: "65ef008b-d6cf-4632-9e29-c4c60f31cae6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:31:45 crc kubenswrapper[4555]: I1209 15:31:45.227825 4555 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65ef008b-d6cf-4632-9e29-c4c60f31cae6-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 15:31:45 crc kubenswrapper[4555]: I1209 15:31:45.227847 4555 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65ef008b-d6cf-4632-9e29-c4c60f31cae6-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 15:31:45 crc kubenswrapper[4555]: I1209 15:31:45.227857 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dj7hv\" (UniqueName: \"kubernetes.io/projected/65ef008b-d6cf-4632-9e29-c4c60f31cae6-kube-api-access-dj7hv\") on node \"crc\" DevicePath \"\"" Dec 09 15:31:45 crc kubenswrapper[4555]: I1209 15:31:45.744788 4555 generic.go:334] "Generic (PLEG): container finished" podID="65ef008b-d6cf-4632-9e29-c4c60f31cae6" containerID="34bbb6b6b10fc40ceba17ce35f53b65eceba1fa4877be15a59bdeb3a436403c0" exitCode=0 Dec 09 15:31:45 crc kubenswrapper[4555]: I1209 15:31:45.744829 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbgs2" event={"ID":"65ef008b-d6cf-4632-9e29-c4c60f31cae6","Type":"ContainerDied","Data":"34bbb6b6b10fc40ceba17ce35f53b65eceba1fa4877be15a59bdeb3a436403c0"} Dec 09 15:31:45 crc kubenswrapper[4555]: I1209 15:31:45.744854 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbgs2" event={"ID":"65ef008b-d6cf-4632-9e29-c4c60f31cae6","Type":"ContainerDied","Data":"0835190c6561474aae332457b80a8a5397a22eda010385366ed6bdb4ccf6c239"} Dec 09 15:31:45 crc kubenswrapper[4555]: I1209 15:31:45.744870 4555 scope.go:117] "RemoveContainer" containerID="34bbb6b6b10fc40ceba17ce35f53b65eceba1fa4877be15a59bdeb3a436403c0" Dec 09 15:31:45 crc kubenswrapper[4555]: I1209 15:31:45.744988 4555 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sbgs2" Dec 09 15:31:45 crc kubenswrapper[4555]: I1209 15:31:45.773179 4555 scope.go:117] "RemoveContainer" containerID="d2c616c76f6efbf7115a55fdfdd4d15f230dbb2e42bcc3854303fa95b61a648c" Dec 09 15:31:45 crc kubenswrapper[4555]: I1209 15:31:45.799372 4555 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sbgs2"] Dec 09 15:31:45 crc kubenswrapper[4555]: I1209 15:31:45.800419 4555 scope.go:117] "RemoveContainer" containerID="f8aba5d46ef5b1be7c3db6430ed4304c619e2bc945d1633bef8be26fe232bc25" Dec 09 15:31:45 crc kubenswrapper[4555]: I1209 15:31:45.804898 4555 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-sbgs2"] Dec 09 15:31:45 crc kubenswrapper[4555]: I1209 15:31:45.818118 4555 scope.go:117] "RemoveContainer" containerID="34bbb6b6b10fc40ceba17ce35f53b65eceba1fa4877be15a59bdeb3a436403c0" Dec 09 15:31:45 crc kubenswrapper[4555]: E1209 15:31:45.818452 4555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34bbb6b6b10fc40ceba17ce35f53b65eceba1fa4877be15a59bdeb3a436403c0\": container with ID starting with 34bbb6b6b10fc40ceba17ce35f53b65eceba1fa4877be15a59bdeb3a436403c0 not found: ID does not exist" containerID="34bbb6b6b10fc40ceba17ce35f53b65eceba1fa4877be15a59bdeb3a436403c0" Dec 09 15:31:45 crc kubenswrapper[4555]: I1209 15:31:45.818488 4555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34bbb6b6b10fc40ceba17ce35f53b65eceba1fa4877be15a59bdeb3a436403c0"} err="failed to get container status \"34bbb6b6b10fc40ceba17ce35f53b65eceba1fa4877be15a59bdeb3a436403c0\": rpc error: code = NotFound desc = could not find container \"34bbb6b6b10fc40ceba17ce35f53b65eceba1fa4877be15a59bdeb3a436403c0\": container with ID starting with 34bbb6b6b10fc40ceba17ce35f53b65eceba1fa4877be15a59bdeb3a436403c0 not found: ID does not exist" Dec 09 15:31:45 crc kubenswrapper[4555]: I1209 15:31:45.818514 4555 scope.go:117] "RemoveContainer" containerID="d2c616c76f6efbf7115a55fdfdd4d15f230dbb2e42bcc3854303fa95b61a648c" Dec 09 15:31:45 crc kubenswrapper[4555]: E1209 15:31:45.818807 4555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2c616c76f6efbf7115a55fdfdd4d15f230dbb2e42bcc3854303fa95b61a648c\": container with ID starting with d2c616c76f6efbf7115a55fdfdd4d15f230dbb2e42bcc3854303fa95b61a648c not found: ID does not exist" containerID="d2c616c76f6efbf7115a55fdfdd4d15f230dbb2e42bcc3854303fa95b61a648c" Dec 09 15:31:45 crc kubenswrapper[4555]: I1209 15:31:45.818892 4555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2c616c76f6efbf7115a55fdfdd4d15f230dbb2e42bcc3854303fa95b61a648c"} err="failed to get container status \"d2c616c76f6efbf7115a55fdfdd4d15f230dbb2e42bcc3854303fa95b61a648c\": rpc error: code = NotFound desc = could not find container \"d2c616c76f6efbf7115a55fdfdd4d15f230dbb2e42bcc3854303fa95b61a648c\": container with ID starting with d2c616c76f6efbf7115a55fdfdd4d15f230dbb2e42bcc3854303fa95b61a648c not found: ID does not exist" Dec 09 15:31:45 crc kubenswrapper[4555]: I1209 15:31:45.818966 4555 scope.go:117] "RemoveContainer" containerID="f8aba5d46ef5b1be7c3db6430ed4304c619e2bc945d1633bef8be26fe232bc25" Dec 09 15:31:45 crc kubenswrapper[4555]: E1209 15:31:45.819231 4555 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"f8aba5d46ef5b1be7c3db6430ed4304c619e2bc945d1633bef8be26fe232bc25\": container with ID starting with f8aba5d46ef5b1be7c3db6430ed4304c619e2bc945d1633bef8be26fe232bc25 not found: ID does not exist" containerID="f8aba5d46ef5b1be7c3db6430ed4304c619e2bc945d1633bef8be26fe232bc25" Dec 09 15:31:45 crc kubenswrapper[4555]: I1209 15:31:45.819333 4555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8aba5d46ef5b1be7c3db6430ed4304c619e2bc945d1633bef8be26fe232bc25"} err="failed to get container status \"f8aba5d46ef5b1be7c3db6430ed4304c619e2bc945d1633bef8be26fe232bc25\": rpc error: code = NotFound desc = could not find container \"f8aba5d46ef5b1be7c3db6430ed4304c619e2bc945d1633bef8be26fe232bc25\": container with ID starting with f8aba5d46ef5b1be7c3db6430ed4304c619e2bc945d1633bef8be26fe232bc25 not found: ID does not exist" Dec 09 15:31:46 crc kubenswrapper[4555]: I1209 15:31:46.167627 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65ef008b-d6cf-4632-9e29-c4c60f31cae6" path="/var/lib/kubelet/pods/65ef008b-d6cf-4632-9e29-c4c60f31cae6/volumes" Dec 09 15:31:46 crc kubenswrapper[4555]: I1209 15:31:46.168182 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7606983e-d511-40d8-ac15-d41f98194c91" path="/var/lib/kubelet/pods/7606983e-d511-40d8-ac15-d41f98194c91/volumes" Dec 09 15:31:54 crc kubenswrapper[4555]: I1209 15:31:54.769793 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-rsbss_dc25bf0a-f5e4-4f6d-9442-6b55a043f751/control-plane-machine-set-operator/0.log" Dec 09 15:31:54 crc kubenswrapper[4555]: I1209 15:31:54.887372 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-5ncnf_c16a9d27-0e19-4e91-ad2b-0a5e3948e19c/kube-rbac-proxy/0.log" Dec 09 15:31:54 crc kubenswrapper[4555]: I1209 15:31:54.900566 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-5ncnf_c16a9d27-0e19-4e91-ad2b-0a5e3948e19c/machine-api-operator/0.log" Dec 09 15:31:55 crc kubenswrapper[4555]: I1209 15:31:55.531396 4555 patch_prober.go:28] interesting pod/machine-config-daemon-x6p6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 15:31:55 crc kubenswrapper[4555]: I1209 15:31:55.531624 4555 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 15:31:55 crc kubenswrapper[4555]: I1209 15:31:55.531660 4555 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" Dec 09 15:31:55 crc kubenswrapper[4555]: I1209 15:31:55.532258 4555 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e4bc645e4f69b0b3b14ca33644fa6151ef72f410fa63d62a5caa76a2af6d0c75"} pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" containerMessage="Container 
machine-config-daemon failed liveness probe, will be restarted" Dec 09 15:31:55 crc kubenswrapper[4555]: I1209 15:31:55.532329 4555 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" containerName="machine-config-daemon" containerID="cri-o://e4bc645e4f69b0b3b14ca33644fa6151ef72f410fa63d62a5caa76a2af6d0c75" gracePeriod=600 Dec 09 15:31:55 crc kubenswrapper[4555]: I1209 15:31:55.801794 4555 generic.go:334] "Generic (PLEG): container finished" podID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" containerID="e4bc645e4f69b0b3b14ca33644fa6151ef72f410fa63d62a5caa76a2af6d0c75" exitCode=0 Dec 09 15:31:55 crc kubenswrapper[4555]: I1209 15:31:55.801858 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" event={"ID":"b929ab12-7d60-4e9f-a1a7-3e2d8b086d36","Type":"ContainerDied","Data":"e4bc645e4f69b0b3b14ca33644fa6151ef72f410fa63d62a5caa76a2af6d0c75"} Dec 09 15:31:55 crc kubenswrapper[4555]: I1209 15:31:55.802033 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" event={"ID":"b929ab12-7d60-4e9f-a1a7-3e2d8b086d36","Type":"ContainerStarted","Data":"b9189c1bb7d2ed40234f5435fa027fe8f9d04f85cada8d567cbb9cf18212bea4"} Dec 09 15:31:55 crc kubenswrapper[4555]: I1209 15:31:55.802051 4555 scope.go:117] "RemoveContainer" containerID="167f103c5df266f675a71b91f42df76a88386126f453e6a8bc5bab7fa2aa1818" Dec 09 15:32:03 crc kubenswrapper[4555]: I1209 15:32:03.157783 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-lp6rb_c6c72476-b1dc-4dd0-a0af-1dbda507a483/cert-manager-controller/0.log" Dec 09 15:32:03 crc kubenswrapper[4555]: I1209 15:32:03.287203 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-gkkj9_ed698ce4-ff03-4c60-9f16-e9267f8450b1/cert-manager-cainjector/0.log" Dec 09 15:32:03 crc kubenswrapper[4555]: I1209 15:32:03.319795 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-l4ts5_4c6fc7d3-7d4a-45cb-bcfd-568d45abba41/cert-manager-webhook/0.log" Dec 09 15:32:11 crc kubenswrapper[4555]: I1209 15:32:11.632870 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-kcpqb_41bb2d5e-b40e-4088-9ffa-ac0234d7050c/nmstate-console-plugin/0.log" Dec 09 15:32:11 crc kubenswrapper[4555]: I1209 15:32:11.759022 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-t6lcd_96ba4ce4-cba2-47a3-b353-e8f3daedfa5a/nmstate-handler/0.log" Dec 09 15:32:11 crc kubenswrapper[4555]: I1209 15:32:11.799879 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-phn8q_41e8b8f7-9b28-46f9-b474-9d519d0e7875/kube-rbac-proxy/0.log" Dec 09 15:32:11 crc kubenswrapper[4555]: I1209 15:32:11.805742 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-phn8q_41e8b8f7-9b28-46f9-b474-9d519d0e7875/nmstate-metrics/0.log" Dec 09 15:32:11 crc kubenswrapper[4555]: I1209 15:32:11.951147 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-q5d4p_759b2d4d-b483-4fcf-8f2a-f5893d685d51/nmstate-operator/0.log" Dec 09 15:32:11 crc kubenswrapper[4555]: I1209 15:32:11.982764 
4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-pdmhs_06ff1c84-8959-4b54-9fea-12a6d8ed97d5/nmstate-webhook/0.log" Dec 09 15:32:20 crc kubenswrapper[4555]: I1209 15:32:20.351754 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-8b59b96d9-kp25m_f6cf7440-1934-4f8f-9e77-22f1d3315527/manager/0.log" Dec 09 15:32:20 crc kubenswrapper[4555]: I1209 15:32:20.356708 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-8b59b96d9-kp25m_f6cf7440-1934-4f8f-9e77-22f1d3315527/kube-rbac-proxy/0.log" Dec 09 15:32:29 crc kubenswrapper[4555]: I1209 15:32:29.076980 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_cluster-logging-operator-ff9846bd-jxn6k_193bdd06-24b0-4f99-9a9e-bdeeac71dd62/cluster-logging-operator/0.log" Dec 09 15:32:29 crc kubenswrapper[4555]: I1209 15:32:29.185425 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_collector-qt8j8_8dcb0d25-363c-47cb-9173-49dad4a2fe5d/collector/0.log" Dec 09 15:32:29 crc kubenswrapper[4555]: I1209 15:32:29.214795 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-compactor-0_37512310-d9d3-46a8-aba6-7699f4ed2bec/loki-compactor/0.log" Dec 09 15:32:29 crc kubenswrapper[4555]: I1209 15:32:29.419355 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-distributor-76cc67bf56-tfvnb_84677826-310d-4103-8976-179fea4591d4/loki-distributor/0.log" Dec 09 15:32:29 crc kubenswrapper[4555]: I1209 15:32:29.422685 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-5c557c5777-cgcbj_2e9a80cf-bda4-4c6f-be38-028a906afc09/gateway/0.log" Dec 09 15:32:29 crc kubenswrapper[4555]: I1209 15:32:29.466007 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-5c557c5777-cgcbj_2e9a80cf-bda4-4c6f-be38-028a906afc09/opa/0.log" Dec 09 15:32:29 crc kubenswrapper[4555]: I1209 15:32:29.566247 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-5c557c5777-mvw9s_9733ac5a-b591-46d7-ab89-bc55453c8330/gateway/0.log" Dec 09 15:32:29 crc kubenswrapper[4555]: I1209 15:32:29.586758 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-5c557c5777-mvw9s_9733ac5a-b591-46d7-ab89-bc55453c8330/opa/0.log" Dec 09 15:32:29 crc kubenswrapper[4555]: I1209 15:32:29.703403 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-index-gateway-0_a95fb43d-d122-4136-8502-780c693b27c5/loki-index-gateway/0.log" Dec 09 15:32:29 crc kubenswrapper[4555]: I1209 15:32:29.750136 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-ingester-0_9f161ea6-77ce-41a8-8a32-92006cadcaa4/loki-ingester/0.log" Dec 09 15:32:29 crc kubenswrapper[4555]: I1209 15:32:29.833791 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-querier-5895d59bb8-9ssm2_f2672ffe-a69b-4127-95b0-077fff7c2dae/loki-querier/0.log" Dec 09 15:32:29 crc kubenswrapper[4555]: I1209 15:32:29.871647 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-query-frontend-84558f7c9f-9vk9m_bbbfd9c8-69ae-4dbb-99b1-1a62073759b3/loki-query-frontend/0.log" Dec 09 15:32:38 crc 
kubenswrapper[4555]: I1209 15:32:38.906437 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-x79zw_3c14ab57-d872-4657-b6bc-30ab1752110a/kube-rbac-proxy/0.log" Dec 09 15:32:39 crc kubenswrapper[4555]: I1209 15:32:39.049920 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-x79zw_3c14ab57-d872-4657-b6bc-30ab1752110a/controller/0.log" Dec 09 15:32:39 crc kubenswrapper[4555]: I1209 15:32:39.104891 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kkbk6_9a99fb67-9306-46ea-ab6e-cc58491f5772/cp-frr-files/0.log" Dec 09 15:32:39 crc kubenswrapper[4555]: I1209 15:32:39.208939 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kkbk6_9a99fb67-9306-46ea-ab6e-cc58491f5772/cp-frr-files/0.log" Dec 09 15:32:39 crc kubenswrapper[4555]: I1209 15:32:39.208967 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kkbk6_9a99fb67-9306-46ea-ab6e-cc58491f5772/cp-metrics/0.log" Dec 09 15:32:39 crc kubenswrapper[4555]: I1209 15:32:39.211513 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kkbk6_9a99fb67-9306-46ea-ab6e-cc58491f5772/cp-reloader/0.log" Dec 09 15:32:39 crc kubenswrapper[4555]: I1209 15:32:39.262028 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kkbk6_9a99fb67-9306-46ea-ab6e-cc58491f5772/cp-reloader/0.log" Dec 09 15:32:39 crc kubenswrapper[4555]: I1209 15:32:39.360776 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kkbk6_9a99fb67-9306-46ea-ab6e-cc58491f5772/cp-frr-files/0.log" Dec 09 15:32:39 crc kubenswrapper[4555]: I1209 15:32:39.375203 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kkbk6_9a99fb67-9306-46ea-ab6e-cc58491f5772/cp-metrics/0.log" Dec 09 15:32:39 crc kubenswrapper[4555]: I1209 15:32:39.386421 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kkbk6_9a99fb67-9306-46ea-ab6e-cc58491f5772/cp-reloader/0.log" Dec 09 15:32:39 crc kubenswrapper[4555]: I1209 15:32:39.411025 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kkbk6_9a99fb67-9306-46ea-ab6e-cc58491f5772/cp-metrics/0.log" Dec 09 15:32:39 crc kubenswrapper[4555]: I1209 15:32:39.510184 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kkbk6_9a99fb67-9306-46ea-ab6e-cc58491f5772/cp-reloader/0.log" Dec 09 15:32:39 crc kubenswrapper[4555]: I1209 15:32:39.518961 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kkbk6_9a99fb67-9306-46ea-ab6e-cc58491f5772/cp-metrics/0.log" Dec 09 15:32:39 crc kubenswrapper[4555]: I1209 15:32:39.532692 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kkbk6_9a99fb67-9306-46ea-ab6e-cc58491f5772/cp-frr-files/0.log" Dec 09 15:32:39 crc kubenswrapper[4555]: I1209 15:32:39.536002 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kkbk6_9a99fb67-9306-46ea-ab6e-cc58491f5772/controller/0.log" Dec 09 15:32:39 crc kubenswrapper[4555]: I1209 15:32:39.645562 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kkbk6_9a99fb67-9306-46ea-ab6e-cc58491f5772/frr-metrics/0.log" Dec 09 15:32:39 crc kubenswrapper[4555]: I1209 15:32:39.657603 4555 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-kkbk6_9a99fb67-9306-46ea-ab6e-cc58491f5772/kube-rbac-proxy/0.log" Dec 09 15:32:39 crc kubenswrapper[4555]: I1209 15:32:39.672663 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kkbk6_9a99fb67-9306-46ea-ab6e-cc58491f5772/kube-rbac-proxy-frr/0.log" Dec 09 15:32:39 crc kubenswrapper[4555]: I1209 15:32:39.682883 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kkbk6_9a99fb67-9306-46ea-ab6e-cc58491f5772/frr/0.log" Dec 09 15:32:39 crc kubenswrapper[4555]: I1209 15:32:39.790533 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-z7j4m_60ab11ad-6b93-44bb-855d-f913d995440a/frr-k8s-webhook-server/0.log" Dec 09 15:32:39 crc kubenswrapper[4555]: I1209 15:32:39.793797 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kkbk6_9a99fb67-9306-46ea-ab6e-cc58491f5772/reloader/0.log" Dec 09 15:32:39 crc kubenswrapper[4555]: I1209 15:32:39.916321 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-67986b9b4b-shcnp_a1efc90d-b721-4bc6-a9e3-74280e8f541d/manager/0.log" Dec 09 15:32:39 crc kubenswrapper[4555]: I1209 15:32:39.938549 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-9b87bb679-nztx8_345488ea-f298-4caa-9128-0bcbca8581f0/webhook-server/0.log" Dec 09 15:32:40 crc kubenswrapper[4555]: I1209 15:32:40.044531 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-hn5bl_ed284bf2-6ecf-4453-8a50-b951e38f8129/kube-rbac-proxy/0.log" Dec 09 15:32:40 crc kubenswrapper[4555]: I1209 15:32:40.161439 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-hn5bl_ed284bf2-6ecf-4453-8a50-b951e38f8129/speaker/0.log" Dec 09 15:32:47 crc kubenswrapper[4555]: I1209 15:32:47.733122 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb89zbbm_b596a9ab-1868-4254-994e-1a99ff17a68a/util/0.log" Dec 09 15:32:47 crc kubenswrapper[4555]: I1209 15:32:47.842465 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb89zbbm_b596a9ab-1868-4254-994e-1a99ff17a68a/util/0.log" Dec 09 15:32:47 crc kubenswrapper[4555]: I1209 15:32:47.855156 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb89zbbm_b596a9ab-1868-4254-994e-1a99ff17a68a/pull/0.log" Dec 09 15:32:47 crc kubenswrapper[4555]: I1209 15:32:47.862668 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb89zbbm_b596a9ab-1868-4254-994e-1a99ff17a68a/pull/0.log" Dec 09 15:32:47 crc kubenswrapper[4555]: I1209 15:32:47.965576 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb89zbbm_b596a9ab-1868-4254-994e-1a99ff17a68a/pull/0.log" Dec 09 15:32:47 crc kubenswrapper[4555]: I1209 15:32:47.970337 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb89zbbm_b596a9ab-1868-4254-994e-1a99ff17a68a/util/0.log" Dec 09 15:32:47 crc kubenswrapper[4555]: I1209 15:32:47.981337 4555 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb89zbbm_b596a9ab-1868-4254-994e-1a99ff17a68a/extract/0.log" Dec 09 15:32:48 crc kubenswrapper[4555]: I1209 15:32:48.088569 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fk6tc2_7b68f60b-a6f6-4057-b219-fc436349e094/util/0.log" Dec 09 15:32:48 crc kubenswrapper[4555]: I1209 15:32:48.204256 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fk6tc2_7b68f60b-a6f6-4057-b219-fc436349e094/util/0.log" Dec 09 15:32:48 crc kubenswrapper[4555]: I1209 15:32:48.225750 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fk6tc2_7b68f60b-a6f6-4057-b219-fc436349e094/pull/0.log" Dec 09 15:32:48 crc kubenswrapper[4555]: I1209 15:32:48.230436 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fk6tc2_7b68f60b-a6f6-4057-b219-fc436349e094/pull/0.log" Dec 09 15:32:48 crc kubenswrapper[4555]: I1209 15:32:48.349919 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fk6tc2_7b68f60b-a6f6-4057-b219-fc436349e094/pull/0.log" Dec 09 15:32:48 crc kubenswrapper[4555]: I1209 15:32:48.350985 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fk6tc2_7b68f60b-a6f6-4057-b219-fc436349e094/util/0.log" Dec 09 15:32:48 crc kubenswrapper[4555]: I1209 15:32:48.365295 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fk6tc2_7b68f60b-a6f6-4057-b219-fc436349e094/extract/0.log" Dec 09 15:32:48 crc kubenswrapper[4555]: I1209 15:32:48.476684 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kqr9n_cfc81043-e54b-4a52-ad3a-6e10344badc1/util/0.log" Dec 09 15:32:48 crc kubenswrapper[4555]: I1209 15:32:48.587987 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kqr9n_cfc81043-e54b-4a52-ad3a-6e10344badc1/util/0.log" Dec 09 15:32:48 crc kubenswrapper[4555]: I1209 15:32:48.598036 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kqr9n_cfc81043-e54b-4a52-ad3a-6e10344badc1/pull/0.log" Dec 09 15:32:48 crc kubenswrapper[4555]: I1209 15:32:48.600170 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kqr9n_cfc81043-e54b-4a52-ad3a-6e10344badc1/pull/0.log" Dec 09 15:32:48 crc kubenswrapper[4555]: I1209 15:32:48.711947 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kqr9n_cfc81043-e54b-4a52-ad3a-6e10344badc1/util/0.log" Dec 09 15:32:48 crc kubenswrapper[4555]: I1209 15:32:48.731055 4555 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kqr9n_cfc81043-e54b-4a52-ad3a-6e10344badc1/extract/0.log" Dec 09 15:32:48 crc kubenswrapper[4555]: I1209 15:32:48.733960 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kqr9n_cfc81043-e54b-4a52-ad3a-6e10344badc1/pull/0.log" Dec 09 15:32:48 crc kubenswrapper[4555]: I1209 15:32:48.827615 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fjhnxm_98f2d54b-bda4-4632-9bbd-1f138d08c470/util/0.log" Dec 09 15:32:48 crc kubenswrapper[4555]: I1209 15:32:48.965123 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fjhnxm_98f2d54b-bda4-4632-9bbd-1f138d08c470/util/0.log" Dec 09 15:32:49 crc kubenswrapper[4555]: I1209 15:32:49.002862 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fjhnxm_98f2d54b-bda4-4632-9bbd-1f138d08c470/pull/0.log" Dec 09 15:32:49 crc kubenswrapper[4555]: I1209 15:32:49.046197 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fjhnxm_98f2d54b-bda4-4632-9bbd-1f138d08c470/pull/0.log" Dec 09 15:32:49 crc kubenswrapper[4555]: I1209 15:32:49.148722 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fjhnxm_98f2d54b-bda4-4632-9bbd-1f138d08c470/util/0.log" Dec 09 15:32:49 crc kubenswrapper[4555]: I1209 15:32:49.182091 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fjhnxm_98f2d54b-bda4-4632-9bbd-1f138d08c470/extract/0.log" Dec 09 15:32:49 crc kubenswrapper[4555]: I1209 15:32:49.191018 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fjhnxm_98f2d54b-bda4-4632-9bbd-1f138d08c470/pull/0.log" Dec 09 15:32:49 crc kubenswrapper[4555]: I1209 15:32:49.279897 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fszs4_59e2835f-791d-4fa3-b024-5d28b08c0bf1/util/0.log" Dec 09 15:32:49 crc kubenswrapper[4555]: I1209 15:32:49.410659 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fszs4_59e2835f-791d-4fa3-b024-5d28b08c0bf1/pull/0.log" Dec 09 15:32:49 crc kubenswrapper[4555]: I1209 15:32:49.437728 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fszs4_59e2835f-791d-4fa3-b024-5d28b08c0bf1/pull/0.log" Dec 09 15:32:49 crc kubenswrapper[4555]: I1209 15:32:49.443175 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fszs4_59e2835f-791d-4fa3-b024-5d28b08c0bf1/util/0.log" Dec 09 15:32:49 crc kubenswrapper[4555]: I1209 15:32:49.548120 4555 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fszs4_59e2835f-791d-4fa3-b024-5d28b08c0bf1/util/0.log" Dec 09 15:32:49 crc kubenswrapper[4555]: I1209 15:32:49.551111 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fszs4_59e2835f-791d-4fa3-b024-5d28b08c0bf1/extract/0.log" Dec 09 15:32:49 crc kubenswrapper[4555]: I1209 15:32:49.562885 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fszs4_59e2835f-791d-4fa3-b024-5d28b08c0bf1/pull/0.log" Dec 09 15:32:49 crc kubenswrapper[4555]: I1209 15:32:49.657747 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-k9bn7_ca7b78bf-7f50-4515-98b8-f7581c0e90d1/extract-utilities/0.log" Dec 09 15:32:49 crc kubenswrapper[4555]: I1209 15:32:49.776538 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-k9bn7_ca7b78bf-7f50-4515-98b8-f7581c0e90d1/extract-utilities/0.log" Dec 09 15:32:49 crc kubenswrapper[4555]: I1209 15:32:49.778566 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-k9bn7_ca7b78bf-7f50-4515-98b8-f7581c0e90d1/extract-content/0.log" Dec 09 15:32:49 crc kubenswrapper[4555]: I1209 15:32:49.778630 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-k9bn7_ca7b78bf-7f50-4515-98b8-f7581c0e90d1/extract-content/0.log" Dec 09 15:32:49 crc kubenswrapper[4555]: I1209 15:32:49.892853 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-k9bn7_ca7b78bf-7f50-4515-98b8-f7581c0e90d1/extract-utilities/0.log" Dec 09 15:32:49 crc kubenswrapper[4555]: I1209 15:32:49.928462 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-k9bn7_ca7b78bf-7f50-4515-98b8-f7581c0e90d1/extract-content/0.log" Dec 09 15:32:50 crc kubenswrapper[4555]: I1209 15:32:50.084404 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qqpmg_523fa32b-68ac-491d-8246-d57c80030415/extract-utilities/0.log" Dec 09 15:32:50 crc kubenswrapper[4555]: I1209 15:32:50.195637 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-k9bn7_ca7b78bf-7f50-4515-98b8-f7581c0e90d1/registry-server/0.log" Dec 09 15:32:50 crc kubenswrapper[4555]: I1209 15:32:50.197964 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qqpmg_523fa32b-68ac-491d-8246-d57c80030415/extract-content/0.log" Dec 09 15:32:50 crc kubenswrapper[4555]: I1209 15:32:50.209356 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qqpmg_523fa32b-68ac-491d-8246-d57c80030415/extract-utilities/0.log" Dec 09 15:32:50 crc kubenswrapper[4555]: I1209 15:32:50.250570 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qqpmg_523fa32b-68ac-491d-8246-d57c80030415/extract-content/0.log" Dec 09 15:32:50 crc kubenswrapper[4555]: I1209 15:32:50.356845 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qqpmg_523fa32b-68ac-491d-8246-d57c80030415/extract-utilities/0.log" Dec 09 15:32:50 crc kubenswrapper[4555]: 
I1209 15:32:50.399808 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qqpmg_523fa32b-68ac-491d-8246-d57c80030415/extract-content/0.log" Dec 09 15:32:50 crc kubenswrapper[4555]: I1209 15:32:50.522995 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-pqkv4_9ef9ab0f-e5f9-40b4-b761-666bd4e1832d/marketplace-operator/0.log" Dec 09 15:32:50 crc kubenswrapper[4555]: I1209 15:32:50.581577 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jsb8t_4e51fd14-b652-4c2f-8919-bca6fe51cc06/extract-utilities/0.log" Dec 09 15:32:50 crc kubenswrapper[4555]: I1209 15:32:50.693973 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jsb8t_4e51fd14-b652-4c2f-8919-bca6fe51cc06/extract-utilities/0.log" Dec 09 15:32:50 crc kubenswrapper[4555]: I1209 15:32:50.703599 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qqpmg_523fa32b-68ac-491d-8246-d57c80030415/registry-server/0.log" Dec 09 15:32:50 crc kubenswrapper[4555]: I1209 15:32:50.715835 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jsb8t_4e51fd14-b652-4c2f-8919-bca6fe51cc06/extract-content/0.log" Dec 09 15:32:50 crc kubenswrapper[4555]: I1209 15:32:50.747532 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jsb8t_4e51fd14-b652-4c2f-8919-bca6fe51cc06/extract-content/0.log" Dec 09 15:32:50 crc kubenswrapper[4555]: I1209 15:32:50.847187 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jsb8t_4e51fd14-b652-4c2f-8919-bca6fe51cc06/extract-content/0.log" Dec 09 15:32:50 crc kubenswrapper[4555]: I1209 15:32:50.867911 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jsb8t_4e51fd14-b652-4c2f-8919-bca6fe51cc06/extract-utilities/0.log" Dec 09 15:32:50 crc kubenswrapper[4555]: I1209 15:32:50.901783 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sqk9k_f3b2ab95-c092-4739-8cbc-02e9dd6de6e7/extract-utilities/0.log" Dec 09 15:32:50 crc kubenswrapper[4555]: I1209 15:32:50.940709 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jsb8t_4e51fd14-b652-4c2f-8919-bca6fe51cc06/registry-server/0.log" Dec 09 15:32:51 crc kubenswrapper[4555]: I1209 15:32:51.042670 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sqk9k_f3b2ab95-c092-4739-8cbc-02e9dd6de6e7/extract-utilities/0.log" Dec 09 15:32:51 crc kubenswrapper[4555]: I1209 15:32:51.044424 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sqk9k_f3b2ab95-c092-4739-8cbc-02e9dd6de6e7/extract-content/0.log" Dec 09 15:32:51 crc kubenswrapper[4555]: I1209 15:32:51.046228 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sqk9k_f3b2ab95-c092-4739-8cbc-02e9dd6de6e7/extract-content/0.log" Dec 09 15:32:51 crc kubenswrapper[4555]: I1209 15:32:51.167622 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sqk9k_f3b2ab95-c092-4739-8cbc-02e9dd6de6e7/extract-content/0.log" Dec 09 15:32:51 crc kubenswrapper[4555]: I1209 15:32:51.168776 
4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sqk9k_f3b2ab95-c092-4739-8cbc-02e9dd6de6e7/extract-utilities/0.log" Dec 09 15:32:51 crc kubenswrapper[4555]: I1209 15:32:51.263108 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sqk9k_f3b2ab95-c092-4739-8cbc-02e9dd6de6e7/registry-server/0.log" Dec 09 15:32:58 crc kubenswrapper[4555]: I1209 15:32:58.689548 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-668cf9dfbb-v4h96_ce4b9ae1-9c56-467a-908e-3b990e17f1cd/prometheus-operator/0.log" Dec 09 15:32:58 crc kubenswrapper[4555]: I1209 15:32:58.738748 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-79cb85764c-4sw2g_42ba541e-2ff1-45bb-85d9-9f4ee87e0bb9/prometheus-operator-admission-webhook/0.log" Dec 09 15:32:58 crc kubenswrapper[4555]: I1209 15:32:58.810997 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-79cb85764c-ff8sc_b486a37e-2de3-4639-b189-e69c501f9a6f/prometheus-operator-admission-webhook/0.log" Dec 09 15:32:58 crc kubenswrapper[4555]: I1209 15:32:58.866068 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-d8bb48f5d-7l249_2302fd31-2414-419b-9733-4d69ef2d00fe/operator/0.log" Dec 09 15:32:58 crc kubenswrapper[4555]: I1209 15:32:58.949127 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5446b9c989-8rffd_7bd718ca-da15-4b6c-89ca-01b62bab1393/perses-operator/0.log" Dec 09 15:33:06 crc kubenswrapper[4555]: I1209 15:33:06.608277 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-8b59b96d9-kp25m_f6cf7440-1934-4f8f-9e77-22f1d3315527/kube-rbac-proxy/0.log" Dec 09 15:33:06 crc kubenswrapper[4555]: I1209 15:33:06.627078 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-8b59b96d9-kp25m_f6cf7440-1934-4f8f-9e77-22f1d3315527/manager/0.log" Dec 09 15:33:33 crc kubenswrapper[4555]: E1209 15:33:33.183904 4555 log.go:32] "PullImage from image service failed" err="rpc error: code = DeadlineExceeded desc = initializing source docker://38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05: pinging container registry 38.102.83.5:5001: Get \"http://38.102.83.5:5001/v2/\": dial tcp 38.102.83.5:5001: i/o timeout" image="38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05" Dec 09 15:33:33 crc kubenswrapper[4555]: E1209 15:33:33.184216 4555 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = DeadlineExceeded desc = initializing source docker://38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05: pinging container registry 38.102.83.5:5001: Get \"http://38.102.83.5:5001/v2/\": dial tcp 38.102.83.5:5001: i/o timeout" image="38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05" Dec 09 15:33:33 crc kubenswrapper[4555]: E1209 15:33:33.184356 4555 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:registry-server,Image:38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:grpc,HostPort:0,ContainerPort:50051,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-t2gln,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-operator-index-dqv6m_openstack-operators(d70d9237-a23b-4955-9f6d-1dee064186ff): ErrImagePull: rpc error: code = DeadlineExceeded desc = initializing source docker://38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05: pinging container registry 38.102.83.5:5001: Get \"http://38.102.83.5:5001/v2/\": dial tcp 38.102.83.5:5001: i/o timeout" logger="UnhandledError" Dec 09 15:33:33 crc kubenswrapper[4555]: E1209 15:33:33.185565 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ErrImagePull: \"rpc error: code = DeadlineExceeded desc = initializing source docker://38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05: pinging container registry 38.102.83.5:5001: Get \\\"http://38.102.83.5:5001/v2/\\\": dial tcp 38.102.83.5:5001: i/o timeout\"" pod="openstack-operators/openstack-operator-index-dqv6m" podUID="d70d9237-a23b-4955-9f6d-1dee064186ff" Dec 09 15:33:44 crc kubenswrapper[4555]: E1209 15:33:44.161939 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-dqv6m" 
podUID="d70d9237-a23b-4955-9f6d-1dee064186ff" Dec 09 15:33:47 crc kubenswrapper[4555]: I1209 15:33:47.132380 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kf4zp"] Dec 09 15:33:47 crc kubenswrapper[4555]: E1209 15:33:47.133697 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65ef008b-d6cf-4632-9e29-c4c60f31cae6" containerName="extract-content" Dec 09 15:33:47 crc kubenswrapper[4555]: I1209 15:33:47.133720 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="65ef008b-d6cf-4632-9e29-c4c60f31cae6" containerName="extract-content" Dec 09 15:33:47 crc kubenswrapper[4555]: E1209 15:33:47.133734 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65ef008b-d6cf-4632-9e29-c4c60f31cae6" containerName="registry-server" Dec 09 15:33:47 crc kubenswrapper[4555]: I1209 15:33:47.133740 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="65ef008b-d6cf-4632-9e29-c4c60f31cae6" containerName="registry-server" Dec 09 15:33:47 crc kubenswrapper[4555]: E1209 15:33:47.133758 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7606983e-d511-40d8-ac15-d41f98194c91" containerName="registry-server" Dec 09 15:33:47 crc kubenswrapper[4555]: I1209 15:33:47.133767 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="7606983e-d511-40d8-ac15-d41f98194c91" containerName="registry-server" Dec 09 15:33:47 crc kubenswrapper[4555]: E1209 15:33:47.133779 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7606983e-d511-40d8-ac15-d41f98194c91" containerName="extract-utilities" Dec 09 15:33:47 crc kubenswrapper[4555]: I1209 15:33:47.133784 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="7606983e-d511-40d8-ac15-d41f98194c91" containerName="extract-utilities" Dec 09 15:33:47 crc kubenswrapper[4555]: E1209 15:33:47.133805 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65ef008b-d6cf-4632-9e29-c4c60f31cae6" containerName="extract-utilities" Dec 09 15:33:47 crc kubenswrapper[4555]: I1209 15:33:47.133811 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="65ef008b-d6cf-4632-9e29-c4c60f31cae6" containerName="extract-utilities" Dec 09 15:33:47 crc kubenswrapper[4555]: E1209 15:33:47.133824 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7606983e-d511-40d8-ac15-d41f98194c91" containerName="extract-content" Dec 09 15:33:47 crc kubenswrapper[4555]: I1209 15:33:47.133830 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="7606983e-d511-40d8-ac15-d41f98194c91" containerName="extract-content" Dec 09 15:33:47 crc kubenswrapper[4555]: I1209 15:33:47.134049 4555 memory_manager.go:354] "RemoveStaleState removing state" podUID="7606983e-d511-40d8-ac15-d41f98194c91" containerName="registry-server" Dec 09 15:33:47 crc kubenswrapper[4555]: I1209 15:33:47.134065 4555 memory_manager.go:354] "RemoveStaleState removing state" podUID="65ef008b-d6cf-4632-9e29-c4c60f31cae6" containerName="registry-server" Dec 09 15:33:47 crc kubenswrapper[4555]: I1209 15:33:47.136958 4555 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kf4zp" Dec 09 15:33:47 crc kubenswrapper[4555]: I1209 15:33:47.148835 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kf4zp"] Dec 09 15:33:47 crc kubenswrapper[4555]: I1209 15:33:47.209662 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/505816b6-fafb-4be6-a45a-298e44ed9c0f-utilities\") pod \"certified-operators-kf4zp\" (UID: \"505816b6-fafb-4be6-a45a-298e44ed9c0f\") " pod="openshift-marketplace/certified-operators-kf4zp" Dec 09 15:33:47 crc kubenswrapper[4555]: I1209 15:33:47.209726 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/505816b6-fafb-4be6-a45a-298e44ed9c0f-catalog-content\") pod \"certified-operators-kf4zp\" (UID: \"505816b6-fafb-4be6-a45a-298e44ed9c0f\") " pod="openshift-marketplace/certified-operators-kf4zp" Dec 09 15:33:47 crc kubenswrapper[4555]: I1209 15:33:47.209858 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfgg6\" (UniqueName: \"kubernetes.io/projected/505816b6-fafb-4be6-a45a-298e44ed9c0f-kube-api-access-dfgg6\") pod \"certified-operators-kf4zp\" (UID: \"505816b6-fafb-4be6-a45a-298e44ed9c0f\") " pod="openshift-marketplace/certified-operators-kf4zp" Dec 09 15:33:47 crc kubenswrapper[4555]: I1209 15:33:47.311591 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/505816b6-fafb-4be6-a45a-298e44ed9c0f-utilities\") pod \"certified-operators-kf4zp\" (UID: \"505816b6-fafb-4be6-a45a-298e44ed9c0f\") " pod="openshift-marketplace/certified-operators-kf4zp" Dec 09 15:33:47 crc kubenswrapper[4555]: I1209 15:33:47.311637 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/505816b6-fafb-4be6-a45a-298e44ed9c0f-catalog-content\") pod \"certified-operators-kf4zp\" (UID: \"505816b6-fafb-4be6-a45a-298e44ed9c0f\") " pod="openshift-marketplace/certified-operators-kf4zp" Dec 09 15:33:47 crc kubenswrapper[4555]: I1209 15:33:47.311696 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfgg6\" (UniqueName: \"kubernetes.io/projected/505816b6-fafb-4be6-a45a-298e44ed9c0f-kube-api-access-dfgg6\") pod \"certified-operators-kf4zp\" (UID: \"505816b6-fafb-4be6-a45a-298e44ed9c0f\") " pod="openshift-marketplace/certified-operators-kf4zp" Dec 09 15:33:47 crc kubenswrapper[4555]: I1209 15:33:47.312183 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/505816b6-fafb-4be6-a45a-298e44ed9c0f-catalog-content\") pod \"certified-operators-kf4zp\" (UID: \"505816b6-fafb-4be6-a45a-298e44ed9c0f\") " pod="openshift-marketplace/certified-operators-kf4zp" Dec 09 15:33:47 crc kubenswrapper[4555]: I1209 15:33:47.312426 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/505816b6-fafb-4be6-a45a-298e44ed9c0f-utilities\") pod \"certified-operators-kf4zp\" (UID: \"505816b6-fafb-4be6-a45a-298e44ed9c0f\") " pod="openshift-marketplace/certified-operators-kf4zp" Dec 09 15:33:47 crc kubenswrapper[4555]: I1209 15:33:47.343542 4555 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-dfgg6\" (UniqueName: \"kubernetes.io/projected/505816b6-fafb-4be6-a45a-298e44ed9c0f-kube-api-access-dfgg6\") pod \"certified-operators-kf4zp\" (UID: \"505816b6-fafb-4be6-a45a-298e44ed9c0f\") " pod="openshift-marketplace/certified-operators-kf4zp" Dec 09 15:33:47 crc kubenswrapper[4555]: I1209 15:33:47.459696 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kf4zp" Dec 09 15:33:47 crc kubenswrapper[4555]: I1209 15:33:47.863338 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kf4zp"] Dec 09 15:33:48 crc kubenswrapper[4555]: I1209 15:33:48.459552 4555 generic.go:334] "Generic (PLEG): container finished" podID="505816b6-fafb-4be6-a45a-298e44ed9c0f" containerID="27ee8ee99d488d6a662a169e61befd1edd26551dd1ea14ccc8a9536b2ba36470" exitCode=0 Dec 09 15:33:48 crc kubenswrapper[4555]: I1209 15:33:48.459640 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kf4zp" event={"ID":"505816b6-fafb-4be6-a45a-298e44ed9c0f","Type":"ContainerDied","Data":"27ee8ee99d488d6a662a169e61befd1edd26551dd1ea14ccc8a9536b2ba36470"} Dec 09 15:33:48 crc kubenswrapper[4555]: I1209 15:33:48.459877 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kf4zp" event={"ID":"505816b6-fafb-4be6-a45a-298e44ed9c0f","Type":"ContainerStarted","Data":"7e383d4f43a32defe1dae4f1268e1f9fb071c82a207a51b870d12f36a0d59b72"} Dec 09 15:33:48 crc kubenswrapper[4555]: I1209 15:33:48.461119 4555 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 09 15:33:48 crc kubenswrapper[4555]: I1209 15:33:48.462704 4555 generic.go:334] "Generic (PLEG): container finished" podID="b5f81540-f9fa-4ce5-9f64-2569567be218" containerID="915c2d43e943b70c7daed29490b0c0e013059eb5c006f36b492b68ed32c21a5f" exitCode=0 Dec 09 15:33:48 crc kubenswrapper[4555]: I1209 15:33:48.462743 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8b7f6/must-gather-sp9dk" event={"ID":"b5f81540-f9fa-4ce5-9f64-2569567be218","Type":"ContainerDied","Data":"915c2d43e943b70c7daed29490b0c0e013059eb5c006f36b492b68ed32c21a5f"} Dec 09 15:33:48 crc kubenswrapper[4555]: I1209 15:33:48.463458 4555 scope.go:117] "RemoveContainer" containerID="915c2d43e943b70c7daed29490b0c0e013059eb5c006f36b492b68ed32c21a5f" Dec 09 15:33:49 crc kubenswrapper[4555]: I1209 15:33:49.045266 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-8b7f6_must-gather-sp9dk_b5f81540-f9fa-4ce5-9f64-2569567be218/gather/0.log" Dec 09 15:33:49 crc kubenswrapper[4555]: I1209 15:33:49.469681 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kf4zp" event={"ID":"505816b6-fafb-4be6-a45a-298e44ed9c0f","Type":"ContainerStarted","Data":"895fe58ce37ff4909364fd1215db3c8e401ed3669d4b59ca17f3e17d6ef7892c"} Dec 09 15:33:50 crc kubenswrapper[4555]: I1209 15:33:50.478782 4555 generic.go:334] "Generic (PLEG): container finished" podID="505816b6-fafb-4be6-a45a-298e44ed9c0f" containerID="895fe58ce37ff4909364fd1215db3c8e401ed3669d4b59ca17f3e17d6ef7892c" exitCode=0 Dec 09 15:33:50 crc kubenswrapper[4555]: I1209 15:33:50.478878 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kf4zp" 
event={"ID":"505816b6-fafb-4be6-a45a-298e44ed9c0f","Type":"ContainerDied","Data":"895fe58ce37ff4909364fd1215db3c8e401ed3669d4b59ca17f3e17d6ef7892c"} Dec 09 15:33:51 crc kubenswrapper[4555]: I1209 15:33:51.489983 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kf4zp" event={"ID":"505816b6-fafb-4be6-a45a-298e44ed9c0f","Type":"ContainerStarted","Data":"6a104e90d2fcd75656328c95979898291aedd47494d213fa11fff0209baa5190"} Dec 09 15:33:51 crc kubenswrapper[4555]: I1209 15:33:51.509197 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kf4zp" podStartSLOduration=1.9759245189999999 podStartE2EDuration="4.509181641s" podCreationTimestamp="2025-12-09 15:33:47 +0000 UTC" firstStartedPulling="2025-12-09 15:33:48.460921752 +0000 UTC m=+2138.597634471" lastFinishedPulling="2025-12-09 15:33:50.994178874 +0000 UTC m=+2141.130891593" observedRunningTime="2025-12-09 15:33:51.507002331 +0000 UTC m=+2141.643715050" watchObservedRunningTime="2025-12-09 15:33:51.509181641 +0000 UTC m=+2141.645894359" Dec 09 15:33:55 crc kubenswrapper[4555]: I1209 15:33:55.505162 4555 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-8b7f6/must-gather-sp9dk"] Dec 09 15:33:55 crc kubenswrapper[4555]: I1209 15:33:55.505732 4555 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-8b7f6/must-gather-sp9dk" podUID="b5f81540-f9fa-4ce5-9f64-2569567be218" containerName="copy" containerID="cri-o://ff6e08d043a6c1546e83dde469b6fa814c44b37be05b7b4b1db29f0e82679da2" gracePeriod=2 Dec 09 15:33:55 crc kubenswrapper[4555]: I1209 15:33:55.511126 4555 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-8b7f6/must-gather-sp9dk"] Dec 09 15:33:55 crc kubenswrapper[4555]: I1209 15:33:55.531939 4555 patch_prober.go:28] interesting pod/machine-config-daemon-x6p6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 15:33:55 crc kubenswrapper[4555]: I1209 15:33:55.531988 4555 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 15:33:55 crc kubenswrapper[4555]: I1209 15:33:55.820280 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-8b7f6_must-gather-sp9dk_b5f81540-f9fa-4ce5-9f64-2569567be218/copy/0.log" Dec 09 15:33:55 crc kubenswrapper[4555]: I1209 15:33:55.820854 4555 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8b7f6/must-gather-sp9dk" Dec 09 15:33:55 crc kubenswrapper[4555]: I1209 15:33:55.960029 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t958x\" (UniqueName: \"kubernetes.io/projected/b5f81540-f9fa-4ce5-9f64-2569567be218-kube-api-access-t958x\") pod \"b5f81540-f9fa-4ce5-9f64-2569567be218\" (UID: \"b5f81540-f9fa-4ce5-9f64-2569567be218\") " Dec 09 15:33:55 crc kubenswrapper[4555]: I1209 15:33:55.960286 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b5f81540-f9fa-4ce5-9f64-2569567be218-must-gather-output\") pod \"b5f81540-f9fa-4ce5-9f64-2569567be218\" (UID: \"b5f81540-f9fa-4ce5-9f64-2569567be218\") " Dec 09 15:33:55 crc kubenswrapper[4555]: I1209 15:33:55.964285 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5f81540-f9fa-4ce5-9f64-2569567be218-kube-api-access-t958x" (OuterVolumeSpecName: "kube-api-access-t958x") pod "b5f81540-f9fa-4ce5-9f64-2569567be218" (UID: "b5f81540-f9fa-4ce5-9f64-2569567be218"). InnerVolumeSpecName "kube-api-access-t958x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:33:55 crc kubenswrapper[4555]: I1209 15:33:55.969271 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t958x\" (UniqueName: \"kubernetes.io/projected/b5f81540-f9fa-4ce5-9f64-2569567be218-kube-api-access-t958x\") on node \"crc\" DevicePath \"\"" Dec 09 15:33:56 crc kubenswrapper[4555]: I1209 15:33:56.011065 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5f81540-f9fa-4ce5-9f64-2569567be218-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "b5f81540-f9fa-4ce5-9f64-2569567be218" (UID: "b5f81540-f9fa-4ce5-9f64-2569567be218"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:33:56 crc kubenswrapper[4555]: I1209 15:33:56.071211 4555 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b5f81540-f9fa-4ce5-9f64-2569567be218-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 09 15:33:56 crc kubenswrapper[4555]: I1209 15:33:56.168172 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5f81540-f9fa-4ce5-9f64-2569567be218" path="/var/lib/kubelet/pods/b5f81540-f9fa-4ce5-9f64-2569567be218/volumes" Dec 09 15:33:56 crc kubenswrapper[4555]: I1209 15:33:56.518329 4555 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-8b7f6_must-gather-sp9dk_b5f81540-f9fa-4ce5-9f64-2569567be218/copy/0.log" Dec 09 15:33:56 crc kubenswrapper[4555]: I1209 15:33:56.518666 4555 generic.go:334] "Generic (PLEG): container finished" podID="b5f81540-f9fa-4ce5-9f64-2569567be218" containerID="ff6e08d043a6c1546e83dde469b6fa814c44b37be05b7b4b1db29f0e82679da2" exitCode=143 Dec 09 15:33:56 crc kubenswrapper[4555]: I1209 15:33:56.518706 4555 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8b7f6/must-gather-sp9dk" Dec 09 15:33:56 crc kubenswrapper[4555]: I1209 15:33:56.518714 4555 scope.go:117] "RemoveContainer" containerID="ff6e08d043a6c1546e83dde469b6fa814c44b37be05b7b4b1db29f0e82679da2" Dec 09 15:33:56 crc kubenswrapper[4555]: I1209 15:33:56.532656 4555 scope.go:117] "RemoveContainer" containerID="915c2d43e943b70c7daed29490b0c0e013059eb5c006f36b492b68ed32c21a5f" Dec 09 15:33:56 crc kubenswrapper[4555]: I1209 15:33:56.561020 4555 scope.go:117] "RemoveContainer" containerID="ff6e08d043a6c1546e83dde469b6fa814c44b37be05b7b4b1db29f0e82679da2" Dec 09 15:33:56 crc kubenswrapper[4555]: E1209 15:33:56.561402 4555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff6e08d043a6c1546e83dde469b6fa814c44b37be05b7b4b1db29f0e82679da2\": container with ID starting with ff6e08d043a6c1546e83dde469b6fa814c44b37be05b7b4b1db29f0e82679da2 not found: ID does not exist" containerID="ff6e08d043a6c1546e83dde469b6fa814c44b37be05b7b4b1db29f0e82679da2" Dec 09 15:33:56 crc kubenswrapper[4555]: I1209 15:33:56.561433 4555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff6e08d043a6c1546e83dde469b6fa814c44b37be05b7b4b1db29f0e82679da2"} err="failed to get container status \"ff6e08d043a6c1546e83dde469b6fa814c44b37be05b7b4b1db29f0e82679da2\": rpc error: code = NotFound desc = could not find container \"ff6e08d043a6c1546e83dde469b6fa814c44b37be05b7b4b1db29f0e82679da2\": container with ID starting with ff6e08d043a6c1546e83dde469b6fa814c44b37be05b7b4b1db29f0e82679da2 not found: ID does not exist" Dec 09 15:33:56 crc kubenswrapper[4555]: I1209 15:33:56.561454 4555 scope.go:117] "RemoveContainer" containerID="915c2d43e943b70c7daed29490b0c0e013059eb5c006f36b492b68ed32c21a5f" Dec 09 15:33:56 crc kubenswrapper[4555]: E1209 15:33:56.561624 4555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"915c2d43e943b70c7daed29490b0c0e013059eb5c006f36b492b68ed32c21a5f\": container with ID starting with 915c2d43e943b70c7daed29490b0c0e013059eb5c006f36b492b68ed32c21a5f not found: ID does not exist" containerID="915c2d43e943b70c7daed29490b0c0e013059eb5c006f36b492b68ed32c21a5f" Dec 09 15:33:56 crc kubenswrapper[4555]: I1209 15:33:56.561642 4555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"915c2d43e943b70c7daed29490b0c0e013059eb5c006f36b492b68ed32c21a5f"} err="failed to get container status \"915c2d43e943b70c7daed29490b0c0e013059eb5c006f36b492b68ed32c21a5f\": rpc error: code = NotFound desc = could not find container \"915c2d43e943b70c7daed29490b0c0e013059eb5c006f36b492b68ed32c21a5f\": container with ID starting with 915c2d43e943b70c7daed29490b0c0e013059eb5c006f36b492b68ed32c21a5f not found: ID does not exist" Dec 09 15:33:57 crc kubenswrapper[4555]: E1209 15:33:57.162085 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-dqv6m" podUID="d70d9237-a23b-4955-9f6d-1dee064186ff" Dec 09 15:33:57 crc kubenswrapper[4555]: I1209 15:33:57.460593 4555 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kf4zp" Dec 09 
15:33:57 crc kubenswrapper[4555]: I1209 15:33:57.460626 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-kf4zp" Dec 09 15:33:57 crc kubenswrapper[4555]: I1209 15:33:57.491217 4555 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kf4zp" Dec 09 15:33:57 crc kubenswrapper[4555]: I1209 15:33:57.552156 4555 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kf4zp" Dec 09 15:33:57 crc kubenswrapper[4555]: I1209 15:33:57.781593 4555 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kf4zp"] Dec 09 15:33:59 crc kubenswrapper[4555]: I1209 15:33:59.534685 4555 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-kf4zp" podUID="505816b6-fafb-4be6-a45a-298e44ed9c0f" containerName="registry-server" containerID="cri-o://6a104e90d2fcd75656328c95979898291aedd47494d213fa11fff0209baa5190" gracePeriod=2 Dec 09 15:33:59 crc kubenswrapper[4555]: I1209 15:33:59.867540 4555 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kf4zp" Dec 09 15:33:59 crc kubenswrapper[4555]: I1209 15:33:59.923495 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/505816b6-fafb-4be6-a45a-298e44ed9c0f-utilities\") pod \"505816b6-fafb-4be6-a45a-298e44ed9c0f\" (UID: \"505816b6-fafb-4be6-a45a-298e44ed9c0f\") " Dec 09 15:33:59 crc kubenswrapper[4555]: I1209 15:33:59.923622 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dfgg6\" (UniqueName: \"kubernetes.io/projected/505816b6-fafb-4be6-a45a-298e44ed9c0f-kube-api-access-dfgg6\") pod \"505816b6-fafb-4be6-a45a-298e44ed9c0f\" (UID: \"505816b6-fafb-4be6-a45a-298e44ed9c0f\") " Dec 09 15:33:59 crc kubenswrapper[4555]: I1209 15:33:59.923722 4555 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/505816b6-fafb-4be6-a45a-298e44ed9c0f-catalog-content\") pod \"505816b6-fafb-4be6-a45a-298e44ed9c0f\" (UID: \"505816b6-fafb-4be6-a45a-298e44ed9c0f\") " Dec 09 15:33:59 crc kubenswrapper[4555]: I1209 15:33:59.924146 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/505816b6-fafb-4be6-a45a-298e44ed9c0f-utilities" (OuterVolumeSpecName: "utilities") pod "505816b6-fafb-4be6-a45a-298e44ed9c0f" (UID: "505816b6-fafb-4be6-a45a-298e44ed9c0f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:33:59 crc kubenswrapper[4555]: I1209 15:33:59.934736 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/505816b6-fafb-4be6-a45a-298e44ed9c0f-kube-api-access-dfgg6" (OuterVolumeSpecName: "kube-api-access-dfgg6") pod "505816b6-fafb-4be6-a45a-298e44ed9c0f" (UID: "505816b6-fafb-4be6-a45a-298e44ed9c0f"). InnerVolumeSpecName "kube-api-access-dfgg6". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 15:34:00 crc kubenswrapper[4555]: I1209 15:34:00.024904 4555 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/505816b6-fafb-4be6-a45a-298e44ed9c0f-utilities\") on node \"crc\" DevicePath \"\""
Dec 09 15:34:00 crc kubenswrapper[4555]: I1209 15:34:00.025109 4555 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dfgg6\" (UniqueName: \"kubernetes.io/projected/505816b6-fafb-4be6-a45a-298e44ed9c0f-kube-api-access-dfgg6\") on node \"crc\" DevicePath \"\""
Dec 09 15:34:00 crc kubenswrapper[4555]: I1209 15:34:00.185311 4555 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/505816b6-fafb-4be6-a45a-298e44ed9c0f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "505816b6-fafb-4be6-a45a-298e44ed9c0f" (UID: "505816b6-fafb-4be6-a45a-298e44ed9c0f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 15:34:00 crc kubenswrapper[4555]: I1209 15:34:00.227754 4555 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/505816b6-fafb-4be6-a45a-298e44ed9c0f-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 09 15:34:00 crc kubenswrapper[4555]: I1209 15:34:00.549378 4555 generic.go:334] "Generic (PLEG): container finished" podID="505816b6-fafb-4be6-a45a-298e44ed9c0f" containerID="6a104e90d2fcd75656328c95979898291aedd47494d213fa11fff0209baa5190" exitCode=0
Dec 09 15:34:00 crc kubenswrapper[4555]: I1209 15:34:00.549423 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kf4zp" event={"ID":"505816b6-fafb-4be6-a45a-298e44ed9c0f","Type":"ContainerDied","Data":"6a104e90d2fcd75656328c95979898291aedd47494d213fa11fff0209baa5190"}
Dec 09 15:34:00 crc kubenswrapper[4555]: I1209 15:34:00.549448 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kf4zp" event={"ID":"505816b6-fafb-4be6-a45a-298e44ed9c0f","Type":"ContainerDied","Data":"7e383d4f43a32defe1dae4f1268e1f9fb071c82a207a51b870d12f36a0d59b72"}
Dec 09 15:34:00 crc kubenswrapper[4555]: I1209 15:34:00.549457 4555 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kf4zp"
Dec 09 15:34:00 crc kubenswrapper[4555]: I1209 15:34:00.549467 4555 scope.go:117] "RemoveContainer" containerID="6a104e90d2fcd75656328c95979898291aedd47494d213fa11fff0209baa5190"
Dec 09 15:34:00 crc kubenswrapper[4555]: I1209 15:34:00.567445 4555 scope.go:117] "RemoveContainer" containerID="895fe58ce37ff4909364fd1215db3c8e401ed3669d4b59ca17f3e17d6ef7892c"
Dec 09 15:34:00 crc kubenswrapper[4555]: I1209 15:34:00.580703 4555 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kf4zp"]
Dec 09 15:34:00 crc kubenswrapper[4555]: I1209 15:34:00.582731 4555 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-kf4zp"]
Dec 09 15:34:00 crc kubenswrapper[4555]: I1209 15:34:00.585200 4555 scope.go:117] "RemoveContainer" containerID="27ee8ee99d488d6a662a169e61befd1edd26551dd1ea14ccc8a9536b2ba36470"
Dec 09 15:34:00 crc kubenswrapper[4555]: I1209 15:34:00.629361 4555 scope.go:117] "RemoveContainer" containerID="6a104e90d2fcd75656328c95979898291aedd47494d213fa11fff0209baa5190"
Dec 09 15:34:00 crc kubenswrapper[4555]: E1209 15:34:00.630459 4555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a104e90d2fcd75656328c95979898291aedd47494d213fa11fff0209baa5190\": container with ID starting with 6a104e90d2fcd75656328c95979898291aedd47494d213fa11fff0209baa5190 not found: ID does not exist" containerID="6a104e90d2fcd75656328c95979898291aedd47494d213fa11fff0209baa5190"
Dec 09 15:34:00 crc kubenswrapper[4555]: I1209 15:34:00.630498 4555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a104e90d2fcd75656328c95979898291aedd47494d213fa11fff0209baa5190"} err="failed to get container status \"6a104e90d2fcd75656328c95979898291aedd47494d213fa11fff0209baa5190\": rpc error: code = NotFound desc = could not find container \"6a104e90d2fcd75656328c95979898291aedd47494d213fa11fff0209baa5190\": container with ID starting with 6a104e90d2fcd75656328c95979898291aedd47494d213fa11fff0209baa5190 not found: ID does not exist"
Dec 09 15:34:00 crc kubenswrapper[4555]: I1209 15:34:00.630526 4555 scope.go:117] "RemoveContainer" containerID="895fe58ce37ff4909364fd1215db3c8e401ed3669d4b59ca17f3e17d6ef7892c"
Dec 09 15:34:00 crc kubenswrapper[4555]: E1209 15:34:00.630851 4555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"895fe58ce37ff4909364fd1215db3c8e401ed3669d4b59ca17f3e17d6ef7892c\": container with ID starting with 895fe58ce37ff4909364fd1215db3c8e401ed3669d4b59ca17f3e17d6ef7892c not found: ID does not exist" containerID="895fe58ce37ff4909364fd1215db3c8e401ed3669d4b59ca17f3e17d6ef7892c"
Dec 09 15:34:00 crc kubenswrapper[4555]: I1209 15:34:00.630887 4555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"895fe58ce37ff4909364fd1215db3c8e401ed3669d4b59ca17f3e17d6ef7892c"} err="failed to get container status \"895fe58ce37ff4909364fd1215db3c8e401ed3669d4b59ca17f3e17d6ef7892c\": rpc error: code = NotFound desc = could not find container \"895fe58ce37ff4909364fd1215db3c8e401ed3669d4b59ca17f3e17d6ef7892c\": container with ID starting with 895fe58ce37ff4909364fd1215db3c8e401ed3669d4b59ca17f3e17d6ef7892c not found: ID does not exist"
Dec 09 15:34:00 crc kubenswrapper[4555]: I1209 15:34:00.630914 4555 scope.go:117] "RemoveContainer" containerID="27ee8ee99d488d6a662a169e61befd1edd26551dd1ea14ccc8a9536b2ba36470"
Dec 09 15:34:00 crc kubenswrapper[4555]: E1209 15:34:00.631339 4555 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27ee8ee99d488d6a662a169e61befd1edd26551dd1ea14ccc8a9536b2ba36470\": container with ID starting with 27ee8ee99d488d6a662a169e61befd1edd26551dd1ea14ccc8a9536b2ba36470 not found: ID does not exist" containerID="27ee8ee99d488d6a662a169e61befd1edd26551dd1ea14ccc8a9536b2ba36470"
Dec 09 15:34:00 crc kubenswrapper[4555]: I1209 15:34:00.631364 4555 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27ee8ee99d488d6a662a169e61befd1edd26551dd1ea14ccc8a9536b2ba36470"} err="failed to get container status \"27ee8ee99d488d6a662a169e61befd1edd26551dd1ea14ccc8a9536b2ba36470\": rpc error: code = NotFound desc = could not find container \"27ee8ee99d488d6a662a169e61befd1edd26551dd1ea14ccc8a9536b2ba36470\": container with ID starting with 27ee8ee99d488d6a662a169e61befd1edd26551dd1ea14ccc8a9536b2ba36470 not found: ID does not exist"
Dec 09 15:34:02 crc kubenswrapper[4555]: I1209 15:34:02.167501 4555 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="505816b6-fafb-4be6-a45a-298e44ed9c0f" path="/var/lib/kubelet/pods/505816b6-fafb-4be6-a45a-298e44ed9c0f/volumes"
Dec 09 15:34:10 crc kubenswrapper[4555]: E1209 15:34:10.165257 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-dqv6m" podUID="d70d9237-a23b-4955-9f6d-1dee064186ff"
Dec 09 15:34:25 crc kubenswrapper[4555]: E1209 15:34:25.162955 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-dqv6m" podUID="d70d9237-a23b-4955-9f6d-1dee064186ff"
Dec 09 15:34:25 crc kubenswrapper[4555]: I1209 15:34:25.531492 4555 patch_prober.go:28] interesting pod/machine-config-daemon-x6p6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 09 15:34:25 crc kubenswrapper[4555]: I1209 15:34:25.531548 4555 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 09 15:34:40 crc kubenswrapper[4555]: E1209 15:34:40.165260 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-dqv6m" podUID="d70d9237-a23b-4955-9f6d-1dee064186ff"
Dec 09 15:34:54 crc kubenswrapper[4555]: E1209 15:34:54.162694 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-dqv6m" podUID="d70d9237-a23b-4955-9f6d-1dee064186ff"
Dec 09 15:34:55 crc kubenswrapper[4555]: I1209 15:34:55.531666 4555 patch_prober.go:28] interesting pod/machine-config-daemon-x6p6q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 09 15:34:55 crc kubenswrapper[4555]: I1209 15:34:55.531863 4555 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 09 15:34:55 crc kubenswrapper[4555]: I1209 15:34:55.531907 4555 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q"
Dec 09 15:34:55 crc kubenswrapper[4555]: I1209 15:34:55.532519 4555 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b9189c1bb7d2ed40234f5435fa027fe8f9d04f85cada8d567cbb9cf18212bea4"} pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 09 15:34:55 crc kubenswrapper[4555]: I1209 15:34:55.532580 4555 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" containerName="machine-config-daemon" containerID="cri-o://b9189c1bb7d2ed40234f5435fa027fe8f9d04f85cada8d567cbb9cf18212bea4" gracePeriod=600
Dec 09 15:34:55 crc kubenswrapper[4555]: E1209 15:34:55.652026 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6p6q_openshift-machine-config-operator(b929ab12-7d60-4e9f-a1a7-3e2d8b086d36)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36"
Dec 09 15:34:55 crc kubenswrapper[4555]: I1209 15:34:55.838090 4555 generic.go:334] "Generic (PLEG): container finished" podID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36" containerID="b9189c1bb7d2ed40234f5435fa027fe8f9d04f85cada8d567cbb9cf18212bea4" exitCode=0
Dec 09 15:34:55 crc kubenswrapper[4555]: I1209 15:34:55.838129 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" event={"ID":"b929ab12-7d60-4e9f-a1a7-3e2d8b086d36","Type":"ContainerDied","Data":"b9189c1bb7d2ed40234f5435fa027fe8f9d04f85cada8d567cbb9cf18212bea4"}
Dec 09 15:34:55 crc kubenswrapper[4555]: I1209 15:34:55.838162 4555 scope.go:117] "RemoveContainer" containerID="e4bc645e4f69b0b3b14ca33644fa6151ef72f410fa63d62a5caa76a2af6d0c75"
Dec 09 15:34:55 crc kubenswrapper[4555]: I1209 15:34:55.838773 4555 scope.go:117] "RemoveContainer" containerID="b9189c1bb7d2ed40234f5435fa027fe8f9d04f85cada8d567cbb9cf18212bea4"
Dec 09 15:34:55 crc kubenswrapper[4555]: E1209 15:34:55.839032 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6p6q_openshift-machine-config-operator(b929ab12-7d60-4e9f-a1a7-3e2d8b086d36)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36"
Dec 09 15:35:07 crc kubenswrapper[4555]: E1209 15:35:07.162881 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-dqv6m" podUID="d70d9237-a23b-4955-9f6d-1dee064186ff"
Dec 09 15:35:09 crc kubenswrapper[4555]: I1209 15:35:09.161476 4555 scope.go:117] "RemoveContainer" containerID="b9189c1bb7d2ed40234f5435fa027fe8f9d04f85cada8d567cbb9cf18212bea4"
Dec 09 15:35:09 crc kubenswrapper[4555]: E1209 15:35:09.161889 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6p6q_openshift-machine-config-operator(b929ab12-7d60-4e9f-a1a7-3e2d8b086d36)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36"
Dec 09 15:35:20 crc kubenswrapper[4555]: E1209 15:35:20.165789 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-dqv6m" podUID="d70d9237-a23b-4955-9f6d-1dee064186ff"
Dec 09 15:35:23 crc kubenswrapper[4555]: I1209 15:35:23.161711 4555 scope.go:117] "RemoveContainer" containerID="b9189c1bb7d2ed40234f5435fa027fe8f9d04f85cada8d567cbb9cf18212bea4"
Dec 09 15:35:23 crc kubenswrapper[4555]: E1209 15:35:23.162220 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6p6q_openshift-machine-config-operator(b929ab12-7d60-4e9f-a1a7-3e2d8b086d36)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36"
Dec 09 15:35:32 crc kubenswrapper[4555]: E1209 15:35:32.163069 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-dqv6m" podUID="d70d9237-a23b-4955-9f6d-1dee064186ff"
Dec 09 15:35:35 crc kubenswrapper[4555]: I1209 15:35:35.160645 4555 scope.go:117] "RemoveContainer" containerID="b9189c1bb7d2ed40234f5435fa027fe8f9d04f85cada8d567cbb9cf18212bea4"
Dec 09 15:35:35 crc kubenswrapper[4555]: E1209 15:35:35.161245 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6p6q_openshift-machine-config-operator(b929ab12-7d60-4e9f-a1a7-3e2d8b086d36)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36"
Dec 09 15:35:45 crc kubenswrapper[4555]: E1209 15:35:45.162219 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-dqv6m" podUID="d70d9237-a23b-4955-9f6d-1dee064186ff"
Dec 09 15:35:46 crc kubenswrapper[4555]: I1209 15:35:46.161236 4555 scope.go:117] "RemoveContainer" containerID="b9189c1bb7d2ed40234f5435fa027fe8f9d04f85cada8d567cbb9cf18212bea4"
Dec 09 15:35:46 crc kubenswrapper[4555]: E1209 15:35:46.161623 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6p6q_openshift-machine-config-operator(b929ab12-7d60-4e9f-a1a7-3e2d8b086d36)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36"
Dec 09 15:36:00 crc kubenswrapper[4555]: E1209 15:36:00.165629 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-dqv6m" podUID="d70d9237-a23b-4955-9f6d-1dee064186ff"
Dec 09 15:36:01 crc kubenswrapper[4555]: I1209 15:36:01.161559 4555 scope.go:117] "RemoveContainer" containerID="b9189c1bb7d2ed40234f5435fa027fe8f9d04f85cada8d567cbb9cf18212bea4"
Dec 09 15:36:01 crc kubenswrapper[4555]: E1209 15:36:01.161840 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6p6q_openshift-machine-config-operator(b929ab12-7d60-4e9f-a1a7-3e2d8b086d36)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36"
Dec 09 15:36:12 crc kubenswrapper[4555]: I1209 15:36:12.161383 4555 scope.go:117] "RemoveContainer" containerID="b9189c1bb7d2ed40234f5435fa027fe8f9d04f85cada8d567cbb9cf18212bea4"
Dec 09 15:36:12 crc kubenswrapper[4555]: E1209 15:36:12.162745 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6p6q_openshift-machine-config-operator(b929ab12-7d60-4e9f-a1a7-3e2d8b086d36)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36"
Dec 09 15:36:15 crc kubenswrapper[4555]: E1209 15:36:15.162682 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-dqv6m" podUID="d70d9237-a23b-4955-9f6d-1dee064186ff"
Dec 09 15:36:24 crc kubenswrapper[4555]: I1209 15:36:24.161116 4555 scope.go:117] "RemoveContainer" containerID="b9189c1bb7d2ed40234f5435fa027fe8f9d04f85cada8d567cbb9cf18212bea4"
Dec 09 15:36:24 crc kubenswrapper[4555]: E1209 15:36:24.161659 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6p6q_openshift-machine-config-operator(b929ab12-7d60-4e9f-a1a7-3e2d8b086d36)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36"
Dec 09 15:36:26 crc kubenswrapper[4555]: E1209 15:36:26.162378 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-dqv6m" podUID="d70d9237-a23b-4955-9f6d-1dee064186ff"
Dec 09 15:36:39 crc kubenswrapper[4555]: I1209 15:36:39.160645 4555 scope.go:117] "RemoveContainer" containerID="b9189c1bb7d2ed40234f5435fa027fe8f9d04f85cada8d567cbb9cf18212bea4"
Dec 09 15:36:39 crc kubenswrapper[4555]: E1209 15:36:39.161177 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6p6q_openshift-machine-config-operator(b929ab12-7d60-4e9f-a1a7-3e2d8b086d36)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36"
Dec 09 15:36:41 crc kubenswrapper[4555]: E1209 15:36:41.162727 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-dqv6m" podUID="d70d9237-a23b-4955-9f6d-1dee064186ff"
Dec 09 15:36:52 crc kubenswrapper[4555]: E1209 15:36:52.162284 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-dqv6m" podUID="d70d9237-a23b-4955-9f6d-1dee064186ff"
Dec 09 15:36:54 crc kubenswrapper[4555]: I1209 15:36:54.161145 4555 scope.go:117] "RemoveContainer" containerID="b9189c1bb7d2ed40234f5435fa027fe8f9d04f85cada8d567cbb9cf18212bea4"
Dec 09 15:36:54 crc kubenswrapper[4555]: E1209 15:36:54.161594 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6p6q_openshift-machine-config-operator(b929ab12-7d60-4e9f-a1a7-3e2d8b086d36)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36"
Dec 09 15:37:03 crc kubenswrapper[4555]: E1209 15:37:03.161855 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-dqv6m" podUID="d70d9237-a23b-4955-9f6d-1dee064186ff"
Dec 09 15:37:09 crc kubenswrapper[4555]: I1209 15:37:09.161573 4555 scope.go:117] "RemoveContainer" containerID="b9189c1bb7d2ed40234f5435fa027fe8f9d04f85cada8d567cbb9cf18212bea4"
Dec 09 15:37:09 crc kubenswrapper[4555]: E1209 15:37:09.162055 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6p6q_openshift-machine-config-operator(b929ab12-7d60-4e9f-a1a7-3e2d8b086d36)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36"
Dec 09 15:37:14 crc kubenswrapper[4555]: E1209 15:37:14.161997 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-dqv6m" podUID="d70d9237-a23b-4955-9f6d-1dee064186ff"
Dec 09 15:37:23 crc kubenswrapper[4555]: I1209 15:37:23.160877 4555 scope.go:117] "RemoveContainer" containerID="b9189c1bb7d2ed40234f5435fa027fe8f9d04f85cada8d567cbb9cf18212bea4"
Dec 09 15:37:23 crc kubenswrapper[4555]: E1209 15:37:23.161515 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6p6q_openshift-machine-config-operator(b929ab12-7d60-4e9f-a1a7-3e2d8b086d36)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36"
Dec 09 15:37:25 crc kubenswrapper[4555]: E1209 15:37:25.162083 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-dqv6m" podUID="d70d9237-a23b-4955-9f6d-1dee064186ff"
Dec 09 15:37:30 crc kubenswrapper[4555]: I1209 15:37:30.150724 4555 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ndtc6"]
Dec 09 15:37:30 crc kubenswrapper[4555]: E1209 15:37:30.151334 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5f81540-f9fa-4ce5-9f64-2569567be218" containerName="gather"
Dec 09 15:37:30 crc kubenswrapper[4555]: I1209 15:37:30.151347 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5f81540-f9fa-4ce5-9f64-2569567be218" containerName="gather"
Dec 09 15:37:30 crc kubenswrapper[4555]: E1209 15:37:30.151356 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="505816b6-fafb-4be6-a45a-298e44ed9c0f" containerName="extract-content"
Dec 09 15:37:30 crc kubenswrapper[4555]: I1209 15:37:30.151363 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="505816b6-fafb-4be6-a45a-298e44ed9c0f" containerName="extract-content"
Dec 09 15:37:30 crc kubenswrapper[4555]: E1209 15:37:30.151372 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="505816b6-fafb-4be6-a45a-298e44ed9c0f" containerName="extract-utilities"
Dec 09 15:37:30 crc kubenswrapper[4555]: I1209 15:37:30.151379 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="505816b6-fafb-4be6-a45a-298e44ed9c0f" containerName="extract-utilities"
Dec 09 15:37:30 crc kubenswrapper[4555]: E1209 15:37:30.151388 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="505816b6-fafb-4be6-a45a-298e44ed9c0f" containerName="registry-server"
Dec 09 15:37:30 crc kubenswrapper[4555]: I1209 15:37:30.151394 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="505816b6-fafb-4be6-a45a-298e44ed9c0f" containerName="registry-server"
Dec 09 15:37:30 crc kubenswrapper[4555]: E1209 15:37:30.151414 4555 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5f81540-f9fa-4ce5-9f64-2569567be218" containerName="copy"
Dec 09 15:37:30 crc kubenswrapper[4555]: I1209 15:37:30.151419 4555 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5f81540-f9fa-4ce5-9f64-2569567be218" containerName="copy"
Dec 09 15:37:30 crc kubenswrapper[4555]: I1209 15:37:30.151557 4555 memory_manager.go:354] "RemoveStaleState removing state" podUID="505816b6-fafb-4be6-a45a-298e44ed9c0f" containerName="registry-server"
Dec 09 15:37:30 crc kubenswrapper[4555]: I1209 15:37:30.151565 4555 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5f81540-f9fa-4ce5-9f64-2569567be218" containerName="gather"
Dec 09 15:37:30 crc kubenswrapper[4555]: I1209 15:37:30.151570 4555 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5f81540-f9fa-4ce5-9f64-2569567be218" containerName="copy"
Dec 09 15:37:30 crc kubenswrapper[4555]: I1209 15:37:30.152489 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ndtc6"
Dec 09 15:37:30 crc kubenswrapper[4555]: I1209 15:37:30.159796 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ndtc6"]
Dec 09 15:37:30 crc kubenswrapper[4555]: I1209 15:37:30.235578 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxkq6\" (UniqueName: \"kubernetes.io/projected/16e72cbc-9f22-4f0d-b3cb-ddd8ab7738a2-kube-api-access-pxkq6\") pod \"redhat-operators-ndtc6\" (UID: \"16e72cbc-9f22-4f0d-b3cb-ddd8ab7738a2\") " pod="openshift-marketplace/redhat-operators-ndtc6"
Dec 09 15:37:30 crc kubenswrapper[4555]: I1209 15:37:30.235704 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16e72cbc-9f22-4f0d-b3cb-ddd8ab7738a2-utilities\") pod \"redhat-operators-ndtc6\" (UID: \"16e72cbc-9f22-4f0d-b3cb-ddd8ab7738a2\") " pod="openshift-marketplace/redhat-operators-ndtc6"
Dec 09 15:37:30 crc kubenswrapper[4555]: I1209 15:37:30.235741 4555 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16e72cbc-9f22-4f0d-b3cb-ddd8ab7738a2-catalog-content\") pod \"redhat-operators-ndtc6\" (UID: \"16e72cbc-9f22-4f0d-b3cb-ddd8ab7738a2\") " pod="openshift-marketplace/redhat-operators-ndtc6"
Dec 09 15:37:30 crc kubenswrapper[4555]: I1209 15:37:30.336961 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxkq6\" (UniqueName: \"kubernetes.io/projected/16e72cbc-9f22-4f0d-b3cb-ddd8ab7738a2-kube-api-access-pxkq6\") pod \"redhat-operators-ndtc6\" (UID: \"16e72cbc-9f22-4f0d-b3cb-ddd8ab7738a2\") " pod="openshift-marketplace/redhat-operators-ndtc6"
Dec 09 15:37:30 crc kubenswrapper[4555]: I1209 15:37:30.337087 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16e72cbc-9f22-4f0d-b3cb-ddd8ab7738a2-utilities\") pod \"redhat-operators-ndtc6\" (UID: \"16e72cbc-9f22-4f0d-b3cb-ddd8ab7738a2\") " pod="openshift-marketplace/redhat-operators-ndtc6"
Dec 09 15:37:30 crc kubenswrapper[4555]: I1209 15:37:30.337129 4555 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16e72cbc-9f22-4f0d-b3cb-ddd8ab7738a2-catalog-content\") pod \"redhat-operators-ndtc6\" (UID: \"16e72cbc-9f22-4f0d-b3cb-ddd8ab7738a2\") " pod="openshift-marketplace/redhat-operators-ndtc6"
Dec 09 15:37:30 crc kubenswrapper[4555]: I1209 15:37:30.337606 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16e72cbc-9f22-4f0d-b3cb-ddd8ab7738a2-utilities\") pod \"redhat-operators-ndtc6\" (UID: \"16e72cbc-9f22-4f0d-b3cb-ddd8ab7738a2\") " pod="openshift-marketplace/redhat-operators-ndtc6"
Dec 09 15:37:30 crc kubenswrapper[4555]: I1209 15:37:30.337609 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16e72cbc-9f22-4f0d-b3cb-ddd8ab7738a2-catalog-content\") pod \"redhat-operators-ndtc6\" (UID: \"16e72cbc-9f22-4f0d-b3cb-ddd8ab7738a2\") " pod="openshift-marketplace/redhat-operators-ndtc6"
Dec 09 15:37:30 crc kubenswrapper[4555]: I1209 15:37:30.353821 4555 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxkq6\" (UniqueName: \"kubernetes.io/projected/16e72cbc-9f22-4f0d-b3cb-ddd8ab7738a2-kube-api-access-pxkq6\") pod \"redhat-operators-ndtc6\" (UID: \"16e72cbc-9f22-4f0d-b3cb-ddd8ab7738a2\") " pod="openshift-marketplace/redhat-operators-ndtc6"
Dec 09 15:37:30 crc kubenswrapper[4555]: I1209 15:37:30.471044 4555 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ndtc6"
Dec 09 15:37:30 crc kubenswrapper[4555]: W1209 15:37:30.864071 4555 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16e72cbc_9f22_4f0d_b3cb_ddd8ab7738a2.slice/crio-d726ea23308e5106db45ccc502c181041ddb1fb8e5e0a8b039850d6db5be2cfa WatchSource:0}: Error finding container d726ea23308e5106db45ccc502c181041ddb1fb8e5e0a8b039850d6db5be2cfa: Status 404 returned error can't find the container with id d726ea23308e5106db45ccc502c181041ddb1fb8e5e0a8b039850d6db5be2cfa
Dec 09 15:37:30 crc kubenswrapper[4555]: I1209 15:37:30.876437 4555 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ndtc6"]
Dec 09 15:37:31 crc kubenswrapper[4555]: I1209 15:37:31.649564 4555 generic.go:334] "Generic (PLEG): container finished" podID="16e72cbc-9f22-4f0d-b3cb-ddd8ab7738a2" containerID="00877d28886e3bc9bb40a28965041628017c3534a057cd2ae59a9e90e72d29c8" exitCode=0
Dec 09 15:37:31 crc kubenswrapper[4555]: I1209 15:37:31.649612 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ndtc6" event={"ID":"16e72cbc-9f22-4f0d-b3cb-ddd8ab7738a2","Type":"ContainerDied","Data":"00877d28886e3bc9bb40a28965041628017c3534a057cd2ae59a9e90e72d29c8"}
Dec 09 15:37:31 crc kubenswrapper[4555]: I1209 15:37:31.649783 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ndtc6" event={"ID":"16e72cbc-9f22-4f0d-b3cb-ddd8ab7738a2","Type":"ContainerStarted","Data":"d726ea23308e5106db45ccc502c181041ddb1fb8e5e0a8b039850d6db5be2cfa"}
Dec 09 15:37:32 crc kubenswrapper[4555]: I1209 15:37:32.656375 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ndtc6" event={"ID":"16e72cbc-9f22-4f0d-b3cb-ddd8ab7738a2","Type":"ContainerStarted","Data":"cd902daba66e6fe503db610fc51d5f317de9b685835d17ba01253c1726799760"}
Dec 09 15:37:33 crc kubenswrapper[4555]: I1209 15:37:33.663094 4555 generic.go:334] "Generic (PLEG): container finished" podID="16e72cbc-9f22-4f0d-b3cb-ddd8ab7738a2" containerID="cd902daba66e6fe503db610fc51d5f317de9b685835d17ba01253c1726799760" exitCode=0
Dec 09 15:37:33 crc kubenswrapper[4555]: I1209 15:37:33.663171 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ndtc6" event={"ID":"16e72cbc-9f22-4f0d-b3cb-ddd8ab7738a2","Type":"ContainerDied","Data":"cd902daba66e6fe503db610fc51d5f317de9b685835d17ba01253c1726799760"}
Dec 09 15:37:34 crc kubenswrapper[4555]: I1209 15:37:34.671435 4555 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ndtc6" event={"ID":"16e72cbc-9f22-4f0d-b3cb-ddd8ab7738a2","Type":"ContainerStarted","Data":"4244e435710924fcbeee0326fffe907580e5cbe9759603c53bce066fcebdbb02"}
Dec 09 15:37:34 crc kubenswrapper[4555]: I1209 15:37:34.684712 4555 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ndtc6" podStartSLOduration=2.1721550609999998 podStartE2EDuration="4.684699037s" podCreationTimestamp="2025-12-09 15:37:30 +0000 UTC" firstStartedPulling="2025-12-09 15:37:31.651199112 +0000 UTC m=+2361.787911830" lastFinishedPulling="2025-12-09 15:37:34.163743088 +0000 UTC m=+2364.300455806" observedRunningTime="2025-12-09 15:37:34.682555254 +0000 UTC m=+2364.819267982" watchObservedRunningTime="2025-12-09 15:37:34.684699037 +0000 UTC m=+2364.821411754"
Dec 09 15:37:38 crc kubenswrapper[4555]: I1209 15:37:38.161102 4555 scope.go:117] "RemoveContainer" containerID="b9189c1bb7d2ed40234f5435fa027fe8f9d04f85cada8d567cbb9cf18212bea4"
Dec 09 15:37:38 crc kubenswrapper[4555]: E1209 15:37:38.162214 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-dqv6m" podUID="d70d9237-a23b-4955-9f6d-1dee064186ff"
Dec 09 15:37:38 crc kubenswrapper[4555]: E1209 15:37:38.162469 4555 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6p6q_openshift-machine-config-operator(b929ab12-7d60-4e9f-a1a7-3e2d8b086d36)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6p6q" podUID="b929ab12-7d60-4e9f-a1a7-3e2d8b086d36"